Dec 04 09:42:33 crc systemd[1]: Starting Kubernetes Kubelet... Dec 04 09:42:33 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 04 09:42:33 
crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 
09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc 
restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 
crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 
crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 
09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:42:33 crc 
restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 
09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 
09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:33 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:42:34 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
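Several of the deprecation warnings in this startup (--container-runtime-endpoint, --volume-plugin-dir, --register-with-taints, and --system-reserved just below) all point at the same remedy: move the setting into the KubeletConfiguration file that the kubelet loads through --config, while --minimum-container-ttl-duration is superseded by eviction thresholds. A minimal sketch of that mapping follows; the paths, taint, and resource sizes are illustrative placeholders, not values read from this node.

# Illustrative KubeletConfiguration sketch -- values are assumptions, not taken from this host
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"    # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
registerWithTaints:                                           # replaces --register-with-taints
- key: node-role.kubernetes.io/master
  effect: NoSchedule
systemReserved:                                               # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                                                 # eviction thresholds supersede --minimum-container-ttl-duration
  memory.available: 100Mi

With a file like this in place, the kubelet would be launched with --config pointing at it, leaving only the flags that still have no config-file equivalent on the command line.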
Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 09:42:34 crc kubenswrapper[4693]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.202361 4693 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205878 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205905 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205911 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205915 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205920 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205925 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205941 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205946 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205950 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205955 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205959 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205964 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205970 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205975 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205980 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205985 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205990 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.205995 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206000 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206005 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206009 4693 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206014 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206018 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206022 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206028 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206035 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206040 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206045 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206050 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206055 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206060 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206065 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206070 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206076 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206082 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206087 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206091 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206097 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206102 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206106 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206111 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206115 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206120 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206124 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206128 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206133 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206138 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206143 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206148 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206152 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206157 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206161 4693 feature_gate.go:330] unrecognized feature gate: Example Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206166 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206170 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206174 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206179 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206183 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206188 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206193 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206199 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 
09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206204 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206210 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206214 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206221 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206229 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206234 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206255 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206261 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206266 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206270 4693 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.206275 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206645 4693 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206663 4693 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206675 4693 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206682 4693 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206689 4693 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206695 4693 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206704 4693 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206711 4693 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206718 4693 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206724 4693 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206730 4693 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206736 4693 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206741 4693 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206747 4693 flags.go:64] FLAG: --cgroup-root="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206753 4693 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206758 4693 flags.go:64] 
FLAG: --client-ca-file="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206763 4693 flags.go:64] FLAG: --cloud-config="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206769 4693 flags.go:64] FLAG: --cloud-provider="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206773 4693 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206781 4693 flags.go:64] FLAG: --cluster-domain="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206786 4693 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206791 4693 flags.go:64] FLAG: --config-dir="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206797 4693 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206802 4693 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206810 4693 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206815 4693 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206820 4693 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206827 4693 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206832 4693 flags.go:64] FLAG: --contention-profiling="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206837 4693 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206843 4693 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206848 4693 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206853 4693 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206860 4693 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206880 4693 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206885 4693 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206890 4693 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206895 4693 flags.go:64] FLAG: --enable-server="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206901 4693 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206908 4693 flags.go:64] FLAG: --event-burst="100" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206914 4693 flags.go:64] FLAG: --event-qps="50" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206920 4693 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206925 4693 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206930 4693 flags.go:64] FLAG: --eviction-hard="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206937 4693 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206943 4693 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206948 4693 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206954 4693 flags.go:64] FLAG: --eviction-soft="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206959 4693 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206964 4693 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206969 4693 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206974 4693 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206979 4693 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206985 4693 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206990 4693 flags.go:64] FLAG: --feature-gates="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.206997 4693 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207002 4693 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207008 4693 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207013 4693 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207019 4693 flags.go:64] FLAG: --healthz-port="10248" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207024 4693 flags.go:64] FLAG: --help="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207029 4693 flags.go:64] FLAG: --hostname-override="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207034 4693 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207039 4693 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207044 4693 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207050 4693 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207055 4693 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207060 4693 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207065 4693 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207071 4693 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207078 4693 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207083 4693 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207089 4693 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207094 4693 flags.go:64] FLAG: --kube-reserved="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207099 4693 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207104 4693 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207109 4693 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207115 4693 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207120 4693 flags.go:64] FLAG: --lock-file="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207125 4693 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207131 4693 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207136 4693 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207144 4693 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207149 4693 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207154 4693 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207160 4693 flags.go:64] FLAG: --logging-format="text" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207165 4693 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207170 4693 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207175 4693 flags.go:64] FLAG: --manifest-url="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207180 4693 flags.go:64] FLAG: --manifest-url-header="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207188 4693 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207194 4693 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207200 4693 flags.go:64] FLAG: --max-pods="110" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207206 4693 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207211 4693 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207216 4693 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207221 4693 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207227 4693 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207232 4693 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207237 4693 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207250 4693 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207255 4693 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207262 4693 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207267 4693 flags.go:64] FLAG: --pod-cidr="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207273 4693 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207282 4693 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207287 4693 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207293 4693 flags.go:64] FLAG: --pods-per-core="0" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207298 4693 flags.go:64] FLAG: --port="10250" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207303 4693 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207309 4693 flags.go:64] FLAG: --provider-id="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207314 4693 flags.go:64] FLAG: --qos-reserved="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207319 4693 flags.go:64] FLAG: --read-only-port="10255" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207325 4693 flags.go:64] FLAG: --register-node="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207347 4693 flags.go:64] FLAG: --register-schedulable="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207353 4693 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207364 4693 flags.go:64] FLAG: --registry-burst="10" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207369 4693 flags.go:64] FLAG: --registry-qps="5" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207374 4693 flags.go:64] FLAG: --reserved-cpus="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207379 4693 flags.go:64] FLAG: --reserved-memory="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207386 4693 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207392 4693 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207398 4693 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207403 4693 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207407 4693 flags.go:64] FLAG: --runonce="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207413 4693 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207418 4693 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207424 4693 flags.go:64] FLAG: --seccomp-default="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207429 4693 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207433 4693 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207438 4693 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207444 4693 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207450 4693 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207455 4693 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 
09:42:34.207460 4693 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207465 4693 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207471 4693 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207476 4693 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207481 4693 flags.go:64] FLAG: --system-cgroups="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207486 4693 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207495 4693 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207499 4693 flags.go:64] FLAG: --tls-cert-file="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207504 4693 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207511 4693 flags.go:64] FLAG: --tls-min-version="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207516 4693 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207522 4693 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207526 4693 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207532 4693 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207537 4693 flags.go:64] FLAG: --v="2" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207543 4693 flags.go:64] FLAG: --version="false" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207550 4693 flags.go:64] FLAG: --vmodule="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207556 4693 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.207562 4693 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207678 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207685 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207690 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207695 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207699 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207704 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207710 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
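The long run just above (from --address through --volume-stats-agg-period) is the kubelet dumping its effective command line through flags.go, one FLAG: --name="value" entry per flag. A small Python sketch, again assuming journal text on stdin (names are illustrative), that folds those entries into a single dictionary so one node's invocation can be diffed against another's:

import re
import sys

# Matches entries such as:
#   flags.go:64] FLAG: --node-ip="192.168.126.11"
FLAG_LINE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*?)"')

def kubelet_flags(lines):
    """Return {flag: value} from the kubelet's flags.go dump in the journal text."""
    flags = {}
    for line in lines:
        # Several FLAG entries can share one physical line in this capture.
        for name, value in FLAG_LINE.findall(line):
            flags[name] = value
    return flags

if __name__ == "__main__":
    flags = kubelet_flags(sys.stdin)
    for name in sorted(flags):
        print(f"{name}={flags[name]}")
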
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207716 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207722 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207727 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207733 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207737 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207742 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207748 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207754 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207758 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207763 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207768 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207773 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207777 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207804 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207809 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207813 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207826 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207830 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207835 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207839 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207843 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207849 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207855 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207860 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207865 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207870 4693 feature_gate.go:330] unrecognized feature gate: Example Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.207927 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208016 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208022 4693 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208029 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208034 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208050 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208055 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208059 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208064 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208069 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208073 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208078 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208084 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208092 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208097 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208102 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208108 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208114 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208120 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208124 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208129 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208134 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208139 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208143 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208148 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208152 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208156 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208160 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208165 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208169 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208178 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208182 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208187 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208192 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208196 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208201 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208206 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.208210 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.208224 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.216810 4693 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 04 09:42:34 crc 
kubenswrapper[4693]: I1204 09:42:34.216852 4693 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217713 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217743 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217750 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217755 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217760 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217765 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217770 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217777 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217782 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217787 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217792 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217798 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217802 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217807 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217812 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217817 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217821 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217830 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217838 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217843 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217850 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217855 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217860 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217865 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217870 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217874 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217880 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217887 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217892 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217897 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217901 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217906 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217910 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217915 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217920 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217927 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217935 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217942 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217950 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217955 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217962 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217967 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217973 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217978 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217983 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217988 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217994 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.217998 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218004 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218008 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218014 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218019 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218024 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218028 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218033 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218037 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218042 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218047 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218051 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218056 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218063 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218070 4693 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218074 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218079 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218083 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218088 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218093 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218097 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218103 4693 feature_gate.go:330] unrecognized feature gate: Example Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218107 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218113 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.218123 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218321 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218352 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218358 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218364 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218369 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218374 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218380 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218384 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218389 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218394 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218399 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218404 4693 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218409 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218414 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218418 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218423 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218428 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218432 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218437 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218443 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218451 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218456 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218461 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218467 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218472 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218476 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218481 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218485 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218490 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218495 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218500 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218505 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218510 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218515 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218521 4693 feature_gate.go:330] unrecognized feature gate: Example Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218525 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218530 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218536 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218543 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218549 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218554 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218560 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218565 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218571 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218576 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218580 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218585 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218589 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218594 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218598 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218603 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218607 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218612 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218617 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218621 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218626 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218630 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218635 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218640 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 09:42:34 crc kubenswrapper[4693]: 
W1204 09:42:34.218644 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218649 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218654 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218659 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218664 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218669 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218674 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218679 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218683 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218688 4693 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218692 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.218697 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.218705 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.218921 4693 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.221938 4693 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.222028 4693 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
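After each pass of per-gate warnings, feature_gate.go:386 prints the effective set as a Go map literal, {map[Name:bool ...]}; this startup repeats it three times with identical contents. A short Python sketch, assuming the same journal text on stdin (names are illustrative), that turns the last such literal into a dictionary of booleans:

import re
import sys

# Matches the summary entry:
#   feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true ... VolumeAttributesClass:false]}
SUMMARY = re.compile(r"feature gates: \{map\[(.*?)\]\}")
PAIR = re.compile(r"(\w+):(true|false)")

def feature_gates(lines):
    """Return {gate: bool} from the last feature_gate.go summary in the journal text."""
    gates = {}
    for line in lines:
        m = SUMMARY.search(line)
        if m:
            gates = {name: value == "true" for name, value in PAIR.findall(m.group(1))}
    return gates

if __name__ == "__main__":
    for gate, enabled in sorted(feature_gates(sys.stdin).items()):
        print(f"{gate}={enabled}")
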
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.222550 4693 server.go:997] "Starting client certificate rotation" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.222574 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.222887 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-24 21:24:35.620044878 +0000 UTC Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.222990 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.227218 4693 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.228633 4693 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.228846 4693 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.237755 4693 log.go:25] "Validated CRI v1 runtime API" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.249370 4693 log.go:25] "Validated CRI v1 image API" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.251781 4693 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.253933 4693 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-09-37-40-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.253966 4693 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.272941 4693 manager.go:217] Machine: {Timestamp:2025-12-04 09:42:34.271733954 +0000 UTC m=+0.169327707 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:72745984-4f49-4edf-ab3d-6eef46afb67c BootID:1bc6d7e6-1878-49d8-8149-caf2b9e5b427 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bd:65:20 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bd:65:20 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:15:ca:83 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cc:a8:79 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:48:a2:6a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:68:e7:cd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:76:31:05:b5:a4:4d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:16:1a:1c:fd:65:60 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.273439 4693 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.273643 4693 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.274222 4693 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.274488 4693 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.274578 4693 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.274825 4693 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.274893 4693 
container_manager_linux.go:303] "Creating device plugin manager" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.275108 4693 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.275203 4693 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.275816 4693 state_mem.go:36] "Initialized new in-memory state store" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.275982 4693 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.277361 4693 kubelet.go:418] "Attempting to sync node with API server" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.277447 4693 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.277552 4693 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.277614 4693 kubelet.go:324] "Adding apiserver pod source" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.277671 4693 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.279170 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.279272 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.279552 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.279634 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.279743 4693 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.280112 4693 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
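Editor's note: everything the kubelet sends to https://api-int.crc.testing:6443 in this window fails with "connection refused": the CSR POST at certificate_manager.go:562 earlier, and the reflector LISTs for *v1.Service and *v1.Node just logged. At this point in startup the static-pod kube-apiserver is not serving yet, so these errors are expected to repeat until it comes up. Below is a small diagnostic sketch (not kubelet code; waitForAPI and the timings are invented for the example) that polls the same endpoint until it accepts TCP connections, which marks roughly when the refused dials should stop.

    // Illustrative diagnostic sketch: wait for the API endpoint seen in the log
    // (api-int.crc.testing:6443) to accept TCP connections, with simple backoff.
    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    func waitForAPI(addr string, maxWait time.Duration) error {
    	deadline := time.Now().Add(maxWait)
    	backoff := 500 * time.Millisecond
    	for {
    		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
    		if err == nil {
    			conn.Close() // port is open; the API server is at least listening
    			return nil
    		}
    		if time.Now().After(deadline) {
    			return fmt.Errorf("gave up waiting for %s: %w", addr, err)
    		}
    		time.Sleep(backoff)
    		if backoff < 10*time.Second {
    			backoff *= 2
    		}
    	}
    }

    func main() {
    	if err := waitForAPI("api-int.crc.testing:6443", 5*time.Minute); err != nil {
    		fmt.Println(err)
    		return
    	}
    	fmt.Println("API server is accepting connections")
    }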
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.280814 4693 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281272 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281296 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281306 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281314 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281340 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281349 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281357 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281370 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281381 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281390 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281401 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281409 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.281621 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.282147 4693 server.go:1280] "Started kubelet" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.282534 4693 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.282566 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.282538 4693 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.283285 4693 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 09:42:34 crc systemd[1]: Started Kubernetes Kubelet. 
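Editor's note: from here on the journal is dominated by klog-style entries: a severity letter fused with the month and day (I1204, W1204, E1204), a microsecond timestamp, the PID, the source file and line, then the message. When triaging a burst such as the reconstruct.go:130 volume entries that follow, it can help to split those fields out and count by severity. The sketch below is rough; the regular expression is inferred only from the lines in this journal, and klogRe plus the field handling are illustrative, not a supported tool. Feed it the kubelet journal on stdin, for example by piping journalctl output for this unit into the binary.

    // Rough sketch for splitting the klog-style entries above into fields
    // (severity, time, pid, source file:line, message) and counting severities.
    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"regexp"
    )

    // Matches e.g. `E1204 09:42:34.228846 4693 certificate_manager.go:562] ...`
    // anywhere in a journald line; adjust if your log format differs.
    var klogRe = regexp.MustCompile(
    	`([IWE])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w./_-]+:\d+)\]\s*(.*)`)

    func main() {
    	counts := map[string]int{}
    	sc := bufio.NewScanner(os.Stdin)
    	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
    	for sc.Scan() {
    		m := klogRe.FindStringSubmatch(sc.Text())
    		if m == nil {
    			continue // not a klog entry (e.g. plain systemd lines)
    		}
    		sev, file, msg := m[1], m[5], m[6]
    		counts[sev]++
    		if sev == "E" { // print only errors for triage
    			fmt.Printf("%s %s %s\n", sev, file, msg)
    		}
    	}
    	fmt.Printf("totals: %v\n", counts)
    }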
Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.284170 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187df9dc76e7905e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:42:34.28211107 +0000 UTC m=+0.179704833,LastTimestamp:2025-12-04 09:42:34.28211107 +0000 UTC m=+0.179704833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.284703 4693 server.go:460] "Adding debug handlers to kubelet server" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.285496 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.285556 4693 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.286156 4693 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.286176 4693 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.286278 4693 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.286706 4693 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.287080 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.287147 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.287318 4693 factory.go:55] Registering systemd factory Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.286618 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:48:34.582573949 +0000 UTC Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.287376 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 37h6m0.295212324s for next certificate rotation Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.287350 4693 factory.go:221] Registration of the systemd container factory successfully Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.287519 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: 
connection refused" interval="200ms" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288205 4693 factory.go:153] Registering CRI-O factory Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288224 4693 factory.go:221] Registration of the crio container factory successfully Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288282 4693 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288302 4693 factory.go:103] Registering Raw factory Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288314 4693 manager.go:1196] Started watching for new ooms in manager Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.288940 4693 manager.go:319] Starting recovery of all containers Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306103 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306185 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306198 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306207 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306220 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306229 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306239 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306254 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 
09:42:34.306265 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306275 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306285 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306294 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306303 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306315 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306324 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306383 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306396 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306408 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306419 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 04 09:42:34 crc 
kubenswrapper[4693]: I1204 09:42:34.306431 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306466 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306477 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.306488 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310063 4693 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310124 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310145 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310162 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310205 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310240 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310256 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310269 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310282 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310295 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310309 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310325 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310376 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310392 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310405 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310418 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310431 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310445 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310458 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310471 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310483 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310495 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310509 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310522 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310539 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310555 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310569 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310584 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310600 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310615 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310634 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310657 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310682 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310698 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310713 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310727 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310742 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310760 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310776 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310790 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310804 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310823 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310839 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310853 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310871 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310886 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310901 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310922 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310936 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310950 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310964 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310979 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.310992 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311007 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311021 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311035 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311051 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311065 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311078 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311092 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311108 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311122 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311136 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311150 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311164 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311178 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311191 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311204 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311218 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311231 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311251 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311266 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311281 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311295 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311308 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311321 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311349 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311364 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311378 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311390 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311481 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311496 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311518 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311533 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311548 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311562 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311576 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311593 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311608 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311623 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311639 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311661 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311675 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311691 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311706 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311740 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311755 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311770 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311786 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311801 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311815 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311829 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311844 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311859 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311873 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311887 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311901 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311915 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311926 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311940 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311953 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311966 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311978 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.311990 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312001 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312012 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312023 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312035 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312045 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312056 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312066 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312079 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312091 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312102 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312114 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312138 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312155 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312171 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312187 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312201 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312213 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312224 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312237 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312255 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312269 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312285 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312301 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312313 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312325 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312358 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312369 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312388 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312400 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312412 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312430 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312442 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312454 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312467 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312479 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312490 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312501 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312513 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312529 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312542 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312558 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312569 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312584 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312596 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312607 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312620 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312631 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312648 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312663 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312675 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312687 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312697 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312707 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312719 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312735 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312746 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312758 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312771 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312781 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312794 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312807 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312823 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312835 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312882 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312894 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312909 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312921 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312934 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312947 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312961 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312979 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312990 4693 reconstruct.go:97] "Volume reconstruction finished" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.312999 4693 reconciler.go:26] "Reconciler: start to sync state" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.313288 4693 manager.go:324] Recovery completed Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.322359 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.323790 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.323830 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.323839 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.324642 4693 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.324662 4693 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.324679 4693 state_mem.go:36] "Initialized new in-memory state store" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.386971 4693 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.447805 4693 policy_none.go:49] "None policy: Start" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.450579 4693 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.450613 4693 state_mem.go:35] "Initializing new in-memory state store" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.457555 4693 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.459885 4693 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.459923 4693 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.459951 4693 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.460053 4693 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.461988 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.462460 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.487224 4693 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.488380 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.497377 4693 manager.go:334] "Starting Device Plugin manager" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.497556 4693 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.497574 4693 server.go:79] "Starting device plugin registration server" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.498002 4693 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.498030 4693 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.498511 4693 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.498604 4693 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.498619 4693 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.508266 4693 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.560511 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.560610 4693 
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.560610 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.562227 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.562272 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.562282 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.562447 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.562968 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563042 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563319 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563350 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563528 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563658 4693 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.563700 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564165 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564177 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564824 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564853 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564862 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564933 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.564991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565080 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565096 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565245 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565576 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565588 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565694 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565757 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.565797 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566273 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566296 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566513 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566539 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566549 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566520 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566585 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566597 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566783 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.566814 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.567462 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.567484 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.567497 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.598554 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.599800 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.599852 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.599862 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.599889 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.600416 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617196 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617221 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617246 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617263 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617324 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617374 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617398 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617414 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617433 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617448 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617465 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617495 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.617510 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718481 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718546 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718568 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718583 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718611 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718636 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718674 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718675 4693 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718714 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718751 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718706 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718726 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718671 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718695 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718817 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 
09:42:34.718894 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718940 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718956 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.718999 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719007 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719039 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719076 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719115 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719182 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 
09:42:34.719196 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.719218 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.801122 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.802705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.802760 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.802786 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.802831 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.803520 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Dec 04 09:42:34 crc kubenswrapper[4693]: E1204 09:42:34.889152 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.909550 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.916526 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.937548 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.942830 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.945629 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-db61b8c93e6c39cfe85e1c4ffe5b9617e6b557cba8fc3e07173627b87b3b73aa WatchSource:0}: Error finding container db61b8c93e6c39cfe85e1c4ffe5b9617e6b557cba8fc3e07173627b87b3b73aa: Status 404 returned error can't find the container with id db61b8c93e6c39cfe85e1c4ffe5b9617e6b557cba8fc3e07173627b87b3b73aa Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.950181 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-32985e7a2494e4cde807a600b5f5ab2ac5e8aa7f1498943878906876d91e4fd1 WatchSource:0}: Error finding container 32985e7a2494e4cde807a600b5f5ab2ac5e8aa7f1498943878906876d91e4fd1: Status 404 returned error can't find the container with id 32985e7a2494e4cde807a600b5f5ab2ac5e8aa7f1498943878906876d91e4fd1 Dec 04 09:42:34 crc kubenswrapper[4693]: I1204 09:42:34.957674 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.965376 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d97fd779374a0346b13c4ae36c96117b036dfbb5a06a83e8e475ec675cdf3c27 WatchSource:0}: Error finding container d97fd779374a0346b13c4ae36c96117b036dfbb5a06a83e8e475ec675cdf3c27: Status 404 returned error can't find the container with id d97fd779374a0346b13c4ae36c96117b036dfbb5a06a83e8e475ec675cdf3c27 Dec 04 09:42:34 crc kubenswrapper[4693]: W1204 09:42:34.967485 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c7e9c2994fd43e959e368c3311ebe51d2d3e47766c23c57ae68163c048254809 WatchSource:0}: Error finding container c7e9c2994fd43e959e368c3311ebe51d2d3e47766c23c57ae68163c048254809: Status 404 returned error can't find the container with id c7e9c2994fd43e959e368c3311ebe51d2d3e47766c23c57ae68163c048254809 Dec 04 09:42:35 crc kubenswrapper[4693]: W1204 09:42:35.191049 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6e9a77146ee92e9492502ca25a295d350bc617dfff475355534475797b10898f WatchSource:0}: Error finding container 6e9a77146ee92e9492502ca25a295d350bc617dfff475355534475797b10898f: Status 404 returned error can't find the container with id 6e9a77146ee92e9492502ca25a295d350bc617dfff475355534475797b10898f Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.203974 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.204968 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.205046 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.205071 4693 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.205129 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.205860 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.284123 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.466023 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db61b8c93e6c39cfe85e1c4ffe5b9617e6b557cba8fc3e07173627b87b3b73aa"} Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.541367 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6e9a77146ee92e9492502ca25a295d350bc617dfff475355534475797b10898f"} Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.542229 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c7e9c2994fd43e959e368c3311ebe51d2d3e47766c23c57ae68163c048254809"} Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.542940 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d97fd779374a0346b13c4ae36c96117b036dfbb5a06a83e8e475ec675cdf3c27"} Dec 04 09:42:35 crc kubenswrapper[4693]: I1204 09:42:35.543619 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"32985e7a2494e4cde807a600b5f5ab2ac5e8aa7f1498943878906876d91e4fd1"} Dec 04 09:42:35 crc kubenswrapper[4693]: W1204 09:42:35.613946 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.614267 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:35 crc kubenswrapper[4693]: W1204 09:42:35.617995 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.618034 4693 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.690709 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Dec 04 09:42:35 crc kubenswrapper[4693]: W1204 09:42:35.728286 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.728397 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:35 crc kubenswrapper[4693]: W1204 09:42:35.985405 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:35 crc kubenswrapper[4693]: E1204 09:42:35.985499 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.006006 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.007231 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.007271 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.007281 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.007307 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:36 crc kubenswrapper[4693]: E1204 09:42:36.007791 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.284122 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.392431 4693 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 09:42:36 crc kubenswrapper[4693]: E1204 09:42:36.396312 4693 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.551812 4693 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="45628d6104ee44436c1e2aabc583cb4c106b24033a1546449822c5808a547420" exitCode=0 Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.552127 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.553168 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"45628d6104ee44436c1e2aabc583cb4c106b24033a1546449822c5808a547420"} Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.554066 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.554135 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.554157 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.555770 4693 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="40ffcd3f6377ac229295a0ad94b1fad9b43f286d12368ca57747c142402d59da" exitCode=0 Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.555844 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"40ffcd3f6377ac229295a0ad94b1fad9b43f286d12368ca57747c142402d59da"} Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.555893 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.557169 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.557221 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.557245 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.557937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bf86e2bd0e3a8a381534887778c1c978c95f7075a924928163e4c11ec30ee13d"} Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.559867 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993" exitCode=0 Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.559923 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993"} Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.559993 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.561144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.561192 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.561211 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.562364 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b" exitCode=0 Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.562417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b"} Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.562474 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.563440 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.563471 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.563482 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.564431 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.565658 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.565715 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:36 crc kubenswrapper[4693]: I1204 09:42:36.565734 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.283428 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:37 crc kubenswrapper[4693]: E1204 09:42:37.291881 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.608062 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:37 crc kubenswrapper[4693]: W1204 09:42:37.613093 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:37 crc kubenswrapper[4693]: E1204 09:42:37.613174 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:37 crc kubenswrapper[4693]: W1204 09:42:37.849573 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:37 crc kubenswrapper[4693]: E1204 09:42:37.849651 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.960820 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.960883 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.960908 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:37 crc kubenswrapper[4693]: I1204 09:42:37.960954 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:37 crc kubenswrapper[4693]: E1204 09:42:37.961604 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.195:6443: connect: connection refused" node="crc" Dec 04 09:42:38 crc kubenswrapper[4693]: W1204 09:42:38.027083 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:38 crc kubenswrapper[4693]: E1204 09:42:38.027194 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.284556 4693 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:38 crc kubenswrapper[4693]: W1204 09:42:38.306176 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.195:6443: connect: connection refused Dec 04 09:42:38 crc kubenswrapper[4693]: E1204 09:42:38.306272 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.195:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.968149 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a"} Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.970289 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783" exitCode=0 Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.970395 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783"} Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.970426 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.971379 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.971407 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.971417 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.972948 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.972940 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a3526797d4274be8717dc7130559a583a9c9891e48acdf43fbe03ee88c005cde"} Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.973874 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.973897 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.973906 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:38 crc 
kubenswrapper[4693]: I1204 09:42:38.978874 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ab407d49238d3e272e50686183e061dd01c1fb4b8648dd942dfe7464968568f4"} Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.982657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b5a8da7097c8c5f85b565878a38bd5ebe517984ac246d31aae0aeb5926f3396"} Dec 04 09:42:38 crc kubenswrapper[4693]: I1204 09:42:38.982683 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2515851626d197e8348ff2f68e056fb696f54e15b5154fd46dced26e416025ce"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.989634 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.989697 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.989715 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.989737 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.989753 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.990421 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.990443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.990451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.996786 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220" exitCode=0 Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.996902 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.996903 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220"} Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.997906 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.997939 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:39 crc kubenswrapper[4693]: I1204 09:42:39.997949 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.000561 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.000543 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bee3d88aec29c2e8c6d66dbfa8e081c20ef6ad362a3f26c7ed4320d4bc89ad01"} Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.000670 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"de2cf2b18fa7df405d3af7ea27b6cdbd0761badd6f34fc187d5aa664d5ef2848"} Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.003599 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcd2182c53830872a9a40d62001e8b81790a6edff1c4365952e4de15c9482828"} Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.003660 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.003696 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.003889 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.003967 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.004027 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.004673 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.004801 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.004874 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.004919 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.005010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:40 crc 
kubenswrapper[4693]: I1204 09:42:40.005028 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.108100 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:40 crc kubenswrapper[4693]: I1204 09:42:40.638566 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.009618 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.009653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7"} Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.009621 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.009799 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.009708 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15"} Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.010357 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.010376 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.010781 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.010801 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.010810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012053 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012130 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.012168 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 
09:42:41.162111 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.163121 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.163171 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.163181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.163209 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.381908 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:41 crc kubenswrapper[4693]: I1204 09:42:41.415953 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014795 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6904f66951f941f5374c6ea55f5c6e8df79aaf6f2c16f044891414ec5201c581"} Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765"} Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014852 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c70d9c617b061628c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9"} Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014870 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014876 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.014906 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.015584 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016388 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016418 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016426 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016488 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016506 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016506 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016528 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016545 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016554 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.016514 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:42 crc kubenswrapper[4693]: I1204 09:42:42.503495 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.017882 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.019048 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020269 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020280 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020516 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:43 crc kubenswrapper[4693]: I1204 09:42:43.020527 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.020534 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.021727 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.021776 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.021786 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.197541 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.197789 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.198932 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.198970 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.198982 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.382146 4693 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:42:44 crc kubenswrapper[4693]: I1204 09:42:44.382224 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 09:42:44 crc kubenswrapper[4693]: E1204 09:42:44.508695 4693 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 09:42:45 crc kubenswrapper[4693]: I1204 09:42:45.016324 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 09:42:45 crc kubenswrapper[4693]: I1204 09:42:45.022660 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:45 crc kubenswrapper[4693]: I1204 09:42:45.023892 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:45 crc kubenswrapper[4693]: I1204 09:42:45.023940 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:45 crc kubenswrapper[4693]: I1204 09:42:45.023952 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 09:42:47.205878 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 09:42:47.206137 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 09:42:47.207708 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 09:42:47.207775 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 09:42:47.207792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:47 crc kubenswrapper[4693]: I1204 
09:42:47.211925 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.028986 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.029239 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.030048 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.030095 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.030136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:48 crc kubenswrapper[4693]: I1204 09:42:48.032666 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:42:49 crc kubenswrapper[4693]: I1204 09:42:49.031735 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:49 crc kubenswrapper[4693]: I1204 09:42:49.033098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:49 crc kubenswrapper[4693]: I1204 09:42:49.033146 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:49 crc kubenswrapper[4693]: I1204 09:42:49.033158 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:49 crc kubenswrapper[4693]: I1204 09:42:49.285389 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.034211 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.035032 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.035083 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.035101 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.105678 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.105747 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.109386 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 09:42:50 crc kubenswrapper[4693]: I1204 09:42:50.109442 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.201710 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.201894 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.203048 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.203087 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.203096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.205995 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.382026 4693 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:42:54 crc kubenswrapper[4693]: I1204 09:42:54.382108 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:42:54 crc kubenswrapper[4693]: E1204 09:42:54.508786 4693 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.046624 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.047673 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.047758 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 
09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.047813 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.049720 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.049853 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.050568 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.050682 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.050775 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.060413 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.095348 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.096924 4693 trace.go:236] Trace[638090412]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:42:42.469) (total time: 12626ms): Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[638090412]: ---"Objects listed" error: 12626ms (09:42:55.096) Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[638090412]: [12.62688478s] [12.62688478s] END Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.096960 4693 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.097234 4693 trace.go:236] Trace[1435391972]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:42:41.215) (total time: 13881ms): Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[1435391972]: ---"Objects listed" error: 13881ms (09:42:55.097) Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[1435391972]: [13.88145958s] [13.88145958s] END Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.097244 4693 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.097841 4693 trace.go:236] Trace[2062612780]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:42:41.171) (total time: 13925ms): Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[2062612780]: ---"Objects listed" error: 13925ms (09:42:55.097) Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[2062612780]: [13.925857242s] [13.925857242s] END Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.097870 4693 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.098373 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.098744 4693 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.100175 4693 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.107505 4693 trace.go:236] Trace[457829714]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:42:42.651) (total time: 12456ms): Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[457829714]: ---"Objects listed" error: 12456ms (09:42:55.107) Dec 04 09:42:55 crc kubenswrapper[4693]: Trace[457829714]: [12.456413531s] [12.456413531s] END Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.107527 4693 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.116901 4693 csr.go:261] certificate signing request csr-47jkb is approved, waiting to be issued Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.122534 4693 csr.go:257] certificate signing request csr-47jkb is issued Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278174 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42396->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278233 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42396->192.168.126.11:17697: read: connection reset by peer" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278173 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42386->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278287 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42386->192.168.126.11:17697: read: connection reset by peer" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278529 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.278858 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 04 
09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.289724 4693 apiserver.go:52] "Watching apiserver" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.295815 4693 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296058 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296376 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296644 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296751 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296793 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.296747 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.296942 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.297002 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.297192 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.297238 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.297876 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.298622 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.298944 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.299171 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.299311 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.299495 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.299987 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.301622 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.301988 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.326402 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.338549 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.351128 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.359562 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.367954 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.377676 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.386841 4693 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.391776 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403126 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403398 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403483 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403550 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403614 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403710 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403793 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403875 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.403954 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404044 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404117 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404391 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404478 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404542 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404605 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404797 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.404875 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404955 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404149 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405028 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404561 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404557 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405148 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404723 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404741 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404869 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404876 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.404892 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405022 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405035 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405261 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405361 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405509 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405638 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405681 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405702 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405722 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405739 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405756 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405772 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405786 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405800 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405816 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405836 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405860 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405876 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405909 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405924 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405940 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405951 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.405957 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406031 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406055 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406076 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406097 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406110 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406119 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406140 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406159 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406178 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406197 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406218 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406240 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406261 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406281 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406301 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406319 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406362 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406385 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406406 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406428 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406453 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406474 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406495 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406516 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406538 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406562 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406585 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406606 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406623 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406639 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406655 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406670 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406687 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406727 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406742 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406758 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406774 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406789 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406813 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406856 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406875 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406892 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406910 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406932 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406968 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406987 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407012 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407036 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407056 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407074 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407096 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407119 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407142 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407380 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407403 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407420 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407435 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407450 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407471 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407493 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407517 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407533 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407572 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407594 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 
09:42:55.407616 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407636 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407654 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407676 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407721 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407741 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407767 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407790 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407812 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.407836 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407859 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407883 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407905 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407928 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407952 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407975 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407999 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408021 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408047 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 
09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408068 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408092 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408114 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408135 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408157 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408179 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408203 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406221 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406372 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408282 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408307 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408348 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408376 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408398 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408420 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408469 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408492 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408514 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408538 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408575 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408600 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408625 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408648 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408671 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408739 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408764 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408786 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408808 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408830 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408853 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408876 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408899 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408922 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408945 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408969 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408996 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409020 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409069 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409094 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409118 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409143 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409165 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409190 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409215 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.409238 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409262 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409285 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409309 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410054 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410087 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410112 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410141 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410164 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410188 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410210 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410235 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410285 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410311 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410355 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410381 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410405 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410440 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410464 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410488 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410513 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410538 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410587 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410609 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410631 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410653 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410676 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410725 4693 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410757 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410782 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410809 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410845 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410878 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410906 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410963 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410986 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411013 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411041 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411065 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411094 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411168 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411186 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411200 4693 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411215 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411228 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411241 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411253 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411267 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411281 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411295 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411350 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411368 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411382 4693 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411394 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411407 4693 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411419 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411432 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411444 4693 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 
09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411460 4693 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411473 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406654 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406803 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.406820 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407094 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407530 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407537 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407688 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407705 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407864 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.407986 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408132 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408265 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408267 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408304 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408593 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408616 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408717 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408744 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408872 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.408920 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409000 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409068 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409106 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409123 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409215 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409452 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409572 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409719 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409879 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.409881 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410036 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415476 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410113 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410695 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.410815 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411691 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.411733 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411864 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.411914 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413030 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413154 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413176 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413390 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413549 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413586 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.413613 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414354 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414420 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414593 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414619 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414632 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414835 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.414874 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415042 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.415184 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:42:55.915164291 +0000 UTC m=+21.812758044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415321 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415792 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415781 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415912 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415251 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415527 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415804 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416067 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416141 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416318 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416449 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416508 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.416621 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.417424 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.417519 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.417645 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:55.917628627 +0000 UTC m=+21.815222380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.417663 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.415490 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.417727 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.418147 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.418273 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.418486 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.418666 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.420472 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.421243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.421589 4693 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.421782 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422169 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422340 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422552 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422748 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422915 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.422956 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423207 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423216 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423431 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423451 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423591 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423658 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423718 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423826 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.423983 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424062 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425282 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425487 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425538 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425600 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425724 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425809 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425821 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426063 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424188 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424443 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426127 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424582 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424686 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424896 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424910 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424971 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425199 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.425222 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.424553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426480 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426550 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426596 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.426767 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427198 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427244 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427313 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427683 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.427929 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428070 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428441 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429417 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428518 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428570 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428232 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428726 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.428742 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429018 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.429166 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.429570 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:55.929554818 +0000 UTC m=+21.827148571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429183 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429232 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.429716 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.430618 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.430867 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.433634 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.436166 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.436311 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.436532 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.436531 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.438253 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.438279 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.438317 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.439533 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.439559 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.441117 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.442995 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.443257 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.443430 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.443683 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.443700 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.443719 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.443732 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.443847 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:55.943828912 +0000 UTC m=+21.841422665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.443959 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.446407 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.446967 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.448924 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.448953 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.448967 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.449023 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:55.949004631 +0000 UTC m=+21.846598494 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.449453 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.451789 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.452163 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.453035 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.453359 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.457107 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.458616 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.459599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.460099 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.460248 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.460723 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.461457 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.461496 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.462141 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.462250 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.462378 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.462493 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.463125 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.478176 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.488208 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.493621 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512073 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512128 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512186 4693 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512196 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512205 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512214 4693 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512223 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512231 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512239 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512247 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512255 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512263 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512270 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512278 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512286 4693 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512294 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512302 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512310 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512317 4693 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512326 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512350 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512357 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512359 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512365 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512408 4693 
reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512422 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512434 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512446 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512457 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512467 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512476 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512487 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512497 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512507 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512518 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512530 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512542 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512552 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512563 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512573 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512585 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512596 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512636 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512651 4693 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512663 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512675 4693 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512687 4693 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512698 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512710 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.512727 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512740 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512751 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512762 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512772 4693 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512782 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512794 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512805 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512818 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512829 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512843 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512855 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512866 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.512877 4693 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512888 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512898 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512909 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512920 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512930 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512941 4693 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512952 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512962 4693 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512974 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512984 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.512995 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513008 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513019 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513031 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513042 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513053 4693 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513065 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513082 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513093 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513105 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513117 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513130 4693 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513143 4693 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513292 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513310 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513324 4693 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513510 4693 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513525 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513535 4693 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513546 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513558 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513568 4693 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513579 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513590 4693 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513601 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513612 4693 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513623 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513634 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on 
node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513645 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513657 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513669 4693 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513680 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513693 4693 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513705 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513716 4693 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513729 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513740 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513751 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513762 4693 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513776 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513788 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513799 4693 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513813 4693 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513825 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513836 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513848 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513859 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513870 4693 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513882 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513893 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513905 4693 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513916 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513927 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513939 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513949 4693 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513961 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513985 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.513997 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514013 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514025 4693 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514036 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514048 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514059 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514070 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514082 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514093 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514104 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514116 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514131 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514142 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514153 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514164 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514175 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514188 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514200 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514211 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514221 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514232 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514244 4693 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514256 4693 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514267 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514366 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514381 4693 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514393 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514407 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514419 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514430 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514441 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514451 4693 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514463 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514478 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514490 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc 
kubenswrapper[4693]: I1204 09:42:55.514501 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514511 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514522 4693 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514533 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514544 4693 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514555 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514566 4693 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514577 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514588 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514598 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514609 4693 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.514620 4693 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.615779 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.627580 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.642524 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.918440 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.918630 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:42:56.918604372 +0000 UTC m=+22.816198125 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:42:55 crc kubenswrapper[4693]: I1204 09:42:55.918791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.918885 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:55 crc kubenswrapper[4693]: E1204 09:42:55.918950 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:56.918939411 +0000 UTC m=+22.816533164 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.019248 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.019299 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.019347 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019427 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019448 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019478 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019483 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:57.019464287 +0000 UTC m=+22.917058050 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019492 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019540 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:57.019523399 +0000 UTC m=+22.917117192 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019577 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019594 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019606 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.019639 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:57.019628942 +0000 UTC m=+22.917222775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.049856 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"40eadc545e0687394c21f6e68ab569d23800a2e25a304bce1bc8d5d5f0823342"} Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.050798 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0f0740cf38a2c943fd62aacea436a736b3fcfd4954e2ee25f1dbef574c615469"} Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.052485 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.054068 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81" exitCode=255 Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.054138 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81"} Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.055311 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f6b1fc22d638551f440f6cfb92a6b483e428c7033cacc7881125e54ada5ef0d8"} Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.078841 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.097269 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.120318 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.123964 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-04 09:37:55 +0000 UTC, rotation deadline is 2026-10-02 10:32:08.476879363 +0000 UTC Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.124012 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7248h49m12.352869812s for next certificate rotation Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.136568 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.144078 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.150981 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.225722 4693 scope.go:117] "RemoveContainer" containerID="ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.225849 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.254151 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p9sjs"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.254512 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.259079 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.260062 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.260853 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.292585 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.321995 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04
T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.322061 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a277725-f562-43ef-86de-0b8d5d451cb9-hosts-file\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.322265 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdt54\" (UniqueName: \"kubernetes.io/projected/9a277725-f562-43ef-86de-0b8d5d451cb9-kube-api-access-cdt54\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.356376 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.382813 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.397193 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.406135 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.413317 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.422817 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.422928 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a277725-f562-43ef-86de-0b8d5d451cb9-hosts-file\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.422982 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdt54\" (UniqueName: \"kubernetes.io/projected/9a277725-f562-43ef-86de-0b8d5d451cb9-kube-api-access-cdt54\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.423054 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a277725-f562-43ef-86de-0b8d5d451cb9-hosts-file\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.438897 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdt54\" (UniqueName: \"kubernetes.io/projected/9a277725-f562-43ef-86de-0b8d5d451cb9-kube-api-access-cdt54\") pod \"node-resolver-p9sjs\" (UID: \"9a277725-f562-43ef-86de-0b8d5d451cb9\") " pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.460668 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.460805 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.464727 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.465324 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.466104 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.466816 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.468313 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.468785 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.469410 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.470436 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.471177 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.472099 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.472713 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.473856 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.474429 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.475284 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.475902 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.476882 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.477450 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.477837 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.478770 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.479410 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.479900 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.481048 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.481472 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.482467 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.482919 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.484129 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.484897 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.485527 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.486564 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.487145 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.488253 4693 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.488455 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.491348 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.494256 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.494994 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.496789 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.497966 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.498747 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.499956 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.501620 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.502268 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.505636 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.506354 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.507128 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.507681 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.508220 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.508853 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.509916 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.510583 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.511162 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.511788 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.512451 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.513147 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.513722 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.552933 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.583723 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p9sjs" Dec 04 09:42:56 crc kubenswrapper[4693]: W1204 09:42:56.598213 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a277725_f562_43ef_86de_0b8d5d451cb9.slice/crio-07fc568a8490b53f4b87bc08e489fdf9277e12931629cfdf36a936680393f568 WatchSource:0}: Error finding container 07fc568a8490b53f4b87bc08e489fdf9277e12931629cfdf36a936680393f568: Status 404 returned error can't find the container with id 07fc568a8490b53f4b87bc08e489fdf9277e12931629cfdf36a936680393f568 Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.647072 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zb44s"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.647325 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-sgn9x"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.647504 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.647545 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cd545"] Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.648013 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.649545 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.649906 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.653009 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.653104 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.653226 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.653312 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655092 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655151 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655107 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655349 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655434 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655560 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.655748 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.673435 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"195d192a-cf00-4d97-aaa5-ba925d617ea4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70d9c617b061628c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6904f66951f941f5374c6ea55f5c6e8df79aaf6
f2c16f044891414ec5201c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.683827 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.694748 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.707042 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zb44s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwq2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zb44s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.717378 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725128 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfhvk\" (UniqueName: \"kubernetes.io/projected/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-kube-api-access-lfhvk\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-multus-certs\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725180 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-hostroot\") pod \"multus-zb44s\" (UID: 
\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725198 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d4f65408-7d18-47db-8a19-f9be435dd348-proxy-tls\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725215 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cnibin\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725231 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-daemon-config\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725262 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-system-cni-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725281 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725296 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4f65408-7d18-47db-8a19-f9be435dd348-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725312 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-system-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725341 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-socket-dir-parent\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-multus\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725510 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-binary-copy\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725546 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725563 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-bin\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cnibin\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725609 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725626 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-conf-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725641 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cni-binary-copy\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725657 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d4f65408-7d18-47db-8a19-f9be435dd348-rootfs\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725674 
4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-etc-kubernetes\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-os-release\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725704 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-kubelet\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725729 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgjp\" (UniqueName: \"kubernetes.io/projected/d4f65408-7d18-47db-8a19-f9be435dd348-kube-api-access-5pgjp\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725744 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-os-release\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725765 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-netns\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725807 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-k8s-cni-cncf-io\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.725821 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwq2z\" (UniqueName: \"kubernetes.io/projected/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-kube-api-access-mwq2z\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.728834 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.737778 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.747563 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.756732 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.764707 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.775972 4693 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.784790 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.797323 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.811032 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826155 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826283 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-os-release\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826322 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-netns\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826366 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-k8s-cni-cncf-io\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826421 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwq2z\" (UniqueName: \"kubernetes.io/projected/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-kube-api-access-mwq2z\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826426 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-netns\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfhvk\" (UniqueName: \"kubernetes.io/projected/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-kube-api-access-lfhvk\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826495 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-multus-certs\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826522 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d4f65408-7d18-47db-8a19-f9be435dd348-proxy-tls\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826563 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-k8s-cni-cncf-io\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826576 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cnibin\") pod \"multus-zb44s\" (UID: 
\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826616 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-run-multus-certs\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826643 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-hostroot\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826676 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-hostroot\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826631 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-os-release\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826697 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-system-cni-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826720 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-system-cni-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826721 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826752 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-daemon-config\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826771 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826775 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4f65408-7d18-47db-8a19-f9be435dd348-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826796 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-system-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-socket-dir-parent\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826850 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-binary-copy\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826868 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826885 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-multus\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826915 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-bin\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826918 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-socket-dir-parent\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826940 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cnibin\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826976 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-conf-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826996 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cni-binary-copy\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.826993 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-multus\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827014 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cnibin\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827032 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-etc-kubernetes\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827014 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-etc-kubernetes\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827062 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d4f65408-7d18-47db-8a19-f9be435dd348-rootfs\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827063 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-conf-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827079 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-kubelet\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " 
pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-os-release\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827114 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d4f65408-7d18-47db-8a19-f9be435dd348-rootfs\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827114 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cnibin\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827133 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgjp\" (UniqueName: \"kubernetes.io/projected/d4f65408-7d18-47db-8a19-f9be435dd348-kube-api-access-5pgjp\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827127 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-cni-bin\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-host-var-lib-kubelet\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-os-release\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827556 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-multus-daemon-config\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827636 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d4f65408-7d18-47db-8a19-f9be435dd348-mcd-auth-proxy-config\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 
09:42:56.827655 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-cni-binary-copy\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827698 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-system-cni-dir\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827712 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-binary-copy\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.827714 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.845194 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"195d192a-cf00-4d97-aaa5-ba925d617ea4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70d9c617b0616
28c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6904f66951f941f5374c6ea55f5c6e8df79aaf6f2c16f044891414ec5201c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.859682 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d4f65408-7d18-47db-8a19-f9be435dd348-proxy-tls\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.860058 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgjp\" (UniqueName: \"kubernetes.io/projected/d4f65408-7d18-47db-8a19-f9be435dd348-kube-api-access-5pgjp\") pod \"machine-config-daemon-sgn9x\" (UID: \"d4f65408-7d18-47db-8a19-f9be435dd348\") " pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.860066 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.860241 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwq2z\" (UniqueName: \"kubernetes.io/projected/2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4-kube-api-access-mwq2z\") pod \"multus-zb44s\" (UID: \"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\") " pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.860568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfhvk\" (UniqueName: \"kubernetes.io/projected/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-kube-api-access-lfhvk\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.863129 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24bfab1f-25f4-44b3-89cf-bb7fda55c2de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cd545\" (UID: \"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\") " pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.868737 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.878967 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zb44s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwq2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zb44s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.887006 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.896481 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f65408-7d18-47db-8a19-f9be435dd348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgn9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.913499 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cd545\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.927937 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.928110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.928157 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:42:58.928123336 +0000 UTC m=+24.825717089 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.928254 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: E1204 09:42:56.928360 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:58.928317252 +0000 UTC m=+24.825911005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.967283 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zb44s" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.975709 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:42:56 crc kubenswrapper[4693]: I1204 09:42:56.985081 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cd545" Dec 04 09:42:56 crc kubenswrapper[4693]: W1204 09:42:56.985683 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f65408_7d18_47db_8a19_f9be435dd348.slice/crio-5899c2d4f78e3ca1c7ffb725c017f347c5d1acabc72258949b81107f37cd276c WatchSource:0}: Error finding container 5899c2d4f78e3ca1c7ffb725c017f347c5d1acabc72258949b81107f37cd276c: Status 404 returned error can't find the container with id 5899c2d4f78e3ca1c7ffb725c017f347c5d1acabc72258949b81107f37cd276c Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.015828 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wm5mt"] Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.016577 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.019436 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.019472 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.020126 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.020140 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.020367 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.021627 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.021730 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.028734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.028910 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.029031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:57 
crc kubenswrapper[4693]: E1204 09:42:57.028977 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029246 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029373 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029055 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029119 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029633 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029648 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029542 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:59.029521216 +0000 UTC m=+24.927114969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029704 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:59.0296833 +0000 UTC m=+24.927277133 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.029722 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:42:59.029714231 +0000 UTC m=+24.927308074 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.033620 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.042503 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.049717 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.058464 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zb44s" 
event={"ID":"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4","Type":"ContainerStarted","Data":"822f34c06c079a23a227eca119c93a9553b207ce6904c7663f3c9139a8e38fbe"} Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.059316 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"5899c2d4f78e3ca1c7ffb725c017f347c5d1acabc72258949b81107f37cd276c"} Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.060558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9818a191a28ae61a76d6ae688a9413b3b51d8fafb10a9f77ab3e5888391f1846"} Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.062149 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9sjs" event={"ID":"9a277725-f562-43ef-86de-0b8d5d451cb9","Type":"ContainerStarted","Data":"07fc568a8490b53f4b87bc08e489fdf9277e12931629cfdf36a936680393f568"} Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.062463 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04
T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.075320 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.088554 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.101117 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.122072 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zb44s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwq2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zb44s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129460 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129598 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129672 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129740 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.129989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130055 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130126 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130206 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130294 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130383 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130448 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130518 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130597 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130664 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130725 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130805 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tdx\" (UniqueName: \"kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.130873 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket\") 
pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.158389 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"195d192a-cf00-4d97-aaa5-ba925d617ea4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70d9c617b061628c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6904f66951f941f5374c6ea55f5c6e8df79aaf6f2c16f044891414ec5201c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5830ed4a7e
cf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.183359 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.222800 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6e969b8-31f1-4fbf-9597-16349612e0c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wm5mt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.230862 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f65408-7d18-47db-8a19-f9be435dd348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgn9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231524 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 
09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231629 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231706 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231768 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231830 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.231892 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232050 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232120 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232121 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232200 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232219 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232253 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232272 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232292 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232350 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232366 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232381 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tdx\" (UniqueName: \"kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232405 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232436 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232453 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232486 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232504 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232539 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232582 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.232090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233026 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233049 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233068 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233086 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233118 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233131 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233141 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233165 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.233212 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.237248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.241775 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cd545\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.246489 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tdx\" (UniqueName: \"kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx\") pod \"ovnkube-node-wm5mt\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.328033 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.460476 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.460586 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:42:57 crc kubenswrapper[4693]: I1204 09:42:57.460656 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:57 crc kubenswrapper[4693]: E1204 09:42:57.460798 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.069104 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.071468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5"} Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.072620 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e281c95c398f539bb0e30e6ee10056fee92df116cba6037e1a9bb1213bc16bbf"} Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.073297 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"704672773ab4756574dd3547af6d35bee3707bdfc1c172c7674bcd936e9beaf8"} Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.074020 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerStarted","Data":"407176432e65a077726c7befc10cff56a6b2c80093c4c2ad1708282e11d27063"} Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.461002 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:58 crc kubenswrapper[4693]: E1204 09:42:58.461150 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.951077 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:42:58 crc kubenswrapper[4693]: E1204 09:42:58.951348 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:02.951316316 +0000 UTC m=+28.848910069 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:42:58 crc kubenswrapper[4693]: I1204 09:42:58.951569 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:58 crc kubenswrapper[4693]: E1204 09:42:58.951696 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:58 crc kubenswrapper[4693]: E1204 09:42:58.951734 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:02.951726148 +0000 UTC m=+28.849319901 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.052389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.052443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.052484 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.052626 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.052647 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.052659 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.052706 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:03.052689535 +0000 UTC m=+28.950283288 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053310 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053455 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053472 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053422 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053539 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:03.053517397 +0000 UTC m=+28.951111200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.053572 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:03.053556588 +0000 UTC m=+28.951150401 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.078175 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9sjs" event={"ID":"9a277725-f562-43ef-86de-0b8d5d451cb9","Type":"ContainerStarted","Data":"da03c53587acf92bf0451d3cc980e4d0761368a6f6a3bd9ef0142e7fdd41ad56"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.079755 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4bdef5989a7b560183248925952a97f25e9468359c7c13e5b223f6bbf1c21b02"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.082150 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"1c8b29d44093cea9edc7ab30906ea2f3703a929324d93e24b6e56838a499e502"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.082203 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.083674 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zb44s" event={"ID":"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4","Type":"ContainerStarted","Data":"0e15b6f8f7d5c95ccd143a1d4a5f1a03a398802a52b93e7061c9c4d7e0318cc0"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.085204 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" exitCode=0 Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.085259 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.088010 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66" exitCode=0 Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.088114 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.093630 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b35a15e5c5a95b7154ed647d3117e8c08e11029458ef6de49b946ce901f23b1d"} Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.094267 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.108363 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f65408-7d18-47db-8a19-f9be435dd348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgn9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc 
kubenswrapper[4693]: I1204 09:42:59.128633 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\
":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready
\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cd545\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.142623 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04
T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.156118 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.167984 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.184060 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.189658 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dlkvr"] Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.190205 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.192615 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.192867 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.193736 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.194060 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.195033 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da03c53587acf92bf0451d3cc980e4d0761368a6f6a3bd9ef0142e7fdd41ad56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.221848 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"195d192a-cf00-4d97-aaa5-ba925d617ea4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70d9c617b061628c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6904f66951f941f5374c6ea55f5c6e8df79aaf6
f2c16f044891414ec5201c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.237070 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.250724 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.255490 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64cf8f2a-747a-482b-81f1-7fb9d9638481-host\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.255545 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9xd\" (UniqueName: \"kubernetes.io/projected/64cf8f2a-747a-482b-81f1-7fb9d9638481-kube-api-access-hz9xd\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.255570 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64cf8f2a-747a-482b-81f1-7fb9d9638481-serviceca\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.263742 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zb44s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwq2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zb44s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.276759 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.301557 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6e969b8-31f1-4fbf-9597-16349612e0c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wm5mt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.331931 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e281c95c398f539bb0e30e6ee10056fee92df116cba6037e1a9bb1213bc16bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.356241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64cf8f2a-747a-482b-81f1-7fb9d9638481-host\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.356301 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9xd\" (UniqueName: \"kubernetes.io/projected/64cf8f2a-747a-482b-81f1-7fb9d9638481-kube-api-access-hz9xd\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.356367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64cf8f2a-747a-482b-81f1-7fb9d9638481-serviceca\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.356416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64cf8f2a-747a-482b-81f1-7fb9d9638481-host\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " 
pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.357659 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/64cf8f2a-747a-482b-81f1-7fb9d9638481-serviceca\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.361452 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6e969b8-31f1-4fbf-9597-16349612e0c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wm5mt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.373223 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dlkvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64cf8f2a-747a-482b-81f1-7fb9d9638481\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hz9xd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dlkvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.379421 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9xd\" (UniqueName: \"kubernetes.io/projected/64cf8f2a-747a-482b-81f1-7fb9d9638481-kube-api-access-hz9xd\") pod \"node-ca-dlkvr\" (UID: \"64cf8f2a-747a-482b-81f1-7fb9d9638481\") " pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.389405 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f65408-7d18-47db-8a19-f9be435dd348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c8b29d44093cea9edc7ab30906ea2f3703a929324d93e24b6e56838a499e502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgn9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.410512 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cd545\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.423123 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:42:49.884804 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:42:49.889960 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1164374132/tls.crt::/tmp/serving-cert-1164374132/tls.key\\\\\\\"\\\\nI1204 09:42:55.264588 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:42:55.266567 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:42:55.266586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:42:55.266605 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:42:55.266610 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:42:55.272023 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:42:55.272047 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:42:55.272055 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:42:55.272058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:42:55.272061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:42:55.272064 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:42:55.272311 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:42:55.274194 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.435143 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35a15e5c5a95b7154ed647d3117e8c08e11029458ef6de49b946ce901f23b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9818a191a28ae61a76d6ae688a9413b3b51d8fafb10a9f77ab3e5888391f1846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.445474 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdef5989a7b560183248925952a97f25e9468359c7c13e5b223f6bbf1c21b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.457559 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.460705 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.460765 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.460796 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:42:59 crc kubenswrapper[4693]: E1204 09:42:59.460899 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.469052 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9sjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a277725-f562-43ef-86de-0b8d5d451cb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da03c53587acf92bf0451d3cc980e4d0761368a6f6a3bd9ef0142e7fdd41ad56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdt54\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9sjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.489240 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"195d192a-cf00-4d97-aaa5-ba925d617ea4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9f1d16b9cbed587bd5b012b61263fd66fadb83c402d7599916857619c2f47f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70d9c617b061628c7c2bf8978bbcb1210836830eb42c81c48de568bed9de7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9d956276512f47cd540d03a6966813b34274f8300b7b98657838c95ee2c2765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6904f66951f941f5374c6ea55f5c6e8df79aaf6
f2c16f044891414ec5201c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01a25248fb13fc62e65feb0b2266b74937d697d8c4aa7b33b8ba88855e91ae15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b090dd2f29112051f6d01d73cb4a15b31c0ac5720b8e1ed25acf3cf1717391b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5830ed4a7ecf954de74ae938ee105cac4514c0ecd06245f46f0e0eaebcb5b783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22325cd9dd4c02323f325a2a98a3e0158d717a457eb655fb162468532bd3a220\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.502479 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.514181 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.514896 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dlkvr" Dec 04 09:42:59 crc kubenswrapper[4693]: W1204 09:42:59.528420 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64cf8f2a_747a_482b_81f1_7fb9d9638481.slice/crio-2c362f6310cb1c3530011592813b4312a7fe652dffa6c9d46997d50528b4a141 WatchSource:0}: Error finding container 2c362f6310cb1c3530011592813b4312a7fe652dffa6c9d46997d50528b4a141: Status 404 returned error can't find the container with id 2c362f6310cb1c3530011592813b4312a7fe652dffa6c9d46997d50528b4a141 Dec 04 09:42:59 crc kubenswrapper[4693]: I1204 09:42:59.530961 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zb44s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e15b6f8f7d5c95ccd143a1d4a5f1a03a398802a52b93e7061c9c4d7e0318cc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwq2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zb44s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:42:59Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.095310 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.095658 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.095672 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.096448 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dlkvr" event={"ID":"64cf8f2a-747a-482b-81f1-7fb9d9638481","Type":"ContainerStarted","Data":"2d00a7d156e9d9f0380fea24e2c8da4b96d5967ee32bdb5b8c08366887b2ef4a"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.096476 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dlkvr" event={"ID":"64cf8f2a-747a-482b-81f1-7fb9d9638481","Type":"ContainerStarted","Data":"2c362f6310cb1c3530011592813b4312a7fe652dffa6c9d46997d50528b4a141"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.097877 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="605d187b139d25ca2caa28a5875d0d1dfffcb6d02b6da6cceec6cbacb4ebb00c" exitCode=0 Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.098002 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"605d187b139d25ca2caa28a5875d0d1dfffcb6d02b6da6cceec6cbacb4ebb00c"} Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.120610 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6e969b8-31f1-4fbf-9597-16349612e0c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:57Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4tdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:57
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wm5mt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:43:00Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.128899 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dlkvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64cf8f2a-747a-482b-81f1-7fb9d9638481\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:43:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d00a7d156e9d9f0380fea24e2c8da4b96d5967ee32bdb5b8c08366887b2ef4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hz9xd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dlkvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:43:00Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.141406 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e281c95c398f539bb0e30e6ee10056fee92df116cba6037e1a9bb1213bc16bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:43:00Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.151731 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f65408-7d18-47db-8a19-f9be435dd348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c8b29d44093cea9edc7ab30906ea2f3703a929324d93e24b6e56838a499e502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pgjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sgn9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:43:00Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.163845 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24bfab1f-25f4-44b3-89cf-bb7fda55c2de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:42:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d45fd220ed28f2eb688145f8fdc2bc8091d856cb6637a9e52eaa1cf71940e66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:42:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:42:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfhvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:42:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cd545\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:43:00Z is after 2025-08-24T17:21:41Z" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.223591 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p9sjs" podStartSLOduration=5.223552591 podStartE2EDuration="5.223552591s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.223201632 +0000 UTC m=+26.120795385" watchObservedRunningTime="2025-12-04 09:43:00.223552591 +0000 UTC m=+26.121146344" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.269190 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.269141988 podStartE2EDuration="4.269141988s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.267115653 +0000 UTC m=+26.164709406" watchObservedRunningTime="2025-12-04 09:43:00.269141988 +0000 UTC m=+26.166735741" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.269707 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.269698073 podStartE2EDuration="4.269698073s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.23835074 +0000 UTC m=+26.135944513" watchObservedRunningTime="2025-12-04 09:43:00.269698073 +0000 UTC m=+26.167291826" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.315449 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zb44s" podStartSLOduration=4.315433475 podStartE2EDuration="4.315433475s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.314984322 +0000 UTC m=+26.212578075" watchObservedRunningTime="2025-12-04 09:43:00.315433475 
+0000 UTC m=+26.213027218" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.380185 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podStartSLOduration=4.380165407 podStartE2EDuration="4.380165407s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.37991424 +0000 UTC m=+26.277508003" watchObservedRunningTime="2025-12-04 09:43:00.380165407 +0000 UTC m=+26.277759200" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.380302 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dlkvr" podStartSLOduration=5.38029652 podStartE2EDuration="5.38029652s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:00.366844868 +0000 UTC m=+26.264438621" watchObservedRunningTime="2025-12-04 09:43:00.38029652 +0000 UTC m=+26.277890273" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.460436 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:00 crc kubenswrapper[4693]: E1204 09:43:00.460794 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.514972 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7"] Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.515349 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.517102 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.517263 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.560920 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kncc4"] Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.561356 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: E1204 09:43:00.561419 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.569027 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.569072 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.569115 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrr6\" (UniqueName: \"kubernetes.io/projected/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-kube-api-access-rbrr6\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.569173 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670242 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctsz\" (UniqueName: \"kubernetes.io/projected/6954da61-bafb-4b35-aa61-0f120c34c747-kube-api-access-xctsz\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670290 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670351 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670372 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: 
\"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrr6\" (UniqueName: \"kubernetes.io/projected/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-kube-api-access-rbrr6\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670410 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.670974 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.671032 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.676161 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.687024 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrr6\" (UniqueName: \"kubernetes.io/projected/5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca-kube-api-access-rbrr6\") pod \"ovnkube-control-plane-749d76644c-77nf7\" (UID: \"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.771172 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xctsz\" (UniqueName: \"kubernetes.io/projected/6954da61-bafb-4b35-aa61-0f120c34c747-kube-api-access-xctsz\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.771246 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: E1204 09:43:00.771409 4693 secret.go:188] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:00 crc kubenswrapper[4693]: E1204 09:43:00.771476 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs podName:6954da61-bafb-4b35-aa61-0f120c34c747 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:01.271457199 +0000 UTC m=+27.169050952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs") pod "network-metrics-daemon-kncc4" (UID: "6954da61-bafb-4b35-aa61-0f120c34c747") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.790983 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctsz\" (UniqueName: \"kubernetes.io/projected/6954da61-bafb-4b35-aa61-0f120c34c747-kube-api-access-xctsz\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:00 crc kubenswrapper[4693]: I1204 09:43:00.827928 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.107393 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.107487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.107514 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.110393 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="e14e73a4f4c04ae7e8ff766151a10a59f4ba7adeb274381c85caba9f2787ed0a" exitCode=0 Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.110445 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"e14e73a4f4c04ae7e8ff766151a10a59f4ba7adeb274381c85caba9f2787ed0a"} Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.279443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:01 crc kubenswrapper[4693]: E1204 09:43:01.279676 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 
09:43:01 crc kubenswrapper[4693]: E1204 09:43:01.279834 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs podName:6954da61-bafb-4b35-aa61-0f120c34c747 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:02.279795212 +0000 UTC m=+28.177389145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs") pod "network-metrics-daemon-kncc4" (UID: "6954da61-bafb-4b35-aa61-0f120c34c747") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.388042 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.396179 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.401685 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.460493 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:01 crc kubenswrapper[4693]: E1204 09:43:01.460757 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.461020 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:01 crc kubenswrapper[4693]: E1204 09:43:01.461451 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.498766 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.501097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.501173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.501192 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.501468 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.518289 4693 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.519026 4693 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.521476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.521549 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.521561 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.521580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.521594 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:43:01Z","lastTransitionTime":"2025-12-04T09:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.596509 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7"] Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.596932 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.599919 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.599916 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.600388 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.600398 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.635532 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.635498278 podStartE2EDuration="635.498278ms" podCreationTimestamp="2025-12-04 09:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:01.633810992 +0000 UTC m=+27.531404785" watchObservedRunningTime="2025-12-04 09:43:01.635498278 +0000 UTC m=+27.533092071" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.684419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.684461 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.684488 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.684688 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.684780 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.785916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.785969 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.785995 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.786014 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.786075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.786142 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.786210 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.787035 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.792673 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.807769 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87cfa58a-1bae-4902-83fa-cfdf4f57fd1d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rr4z7\" (UID: \"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: I1204 09:43:01.921550 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" Dec 04 09:43:01 crc kubenswrapper[4693]: W1204 09:43:01.938032 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87cfa58a_1bae_4902_83fa_cfdf4f57fd1d.slice/crio-3dcbce6003f0db43d991890748698dc976b4fa9ac9cd09c703498b738d1bfb42 WatchSource:0}: Error finding container 3dcbce6003f0db43d991890748698dc976b4fa9ac9cd09c703498b738d1bfb42: Status 404 returned error can't find the container with id 3dcbce6003f0db43d991890748698dc976b4fa9ac9cd09c703498b738d1bfb42 Dec 04 09:43:02 crc kubenswrapper[4693]: I1204 09:43:02.116017 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" event={"ID":"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d","Type":"ContainerStarted","Data":"3dcbce6003f0db43d991890748698dc976b4fa9ac9cd09c703498b738d1bfb42"} Dec 04 09:43:02 crc kubenswrapper[4693]: E1204 09:43:02.125810 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:02 crc kubenswrapper[4693]: I1204 09:43:02.292311 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:02 crc kubenswrapper[4693]: E1204 09:43:02.292533 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:02 crc kubenswrapper[4693]: E1204 09:43:02.292621 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs podName:6954da61-bafb-4b35-aa61-0f120c34c747 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:04.292597276 +0000 UTC m=+30.190191069 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs") pod "network-metrics-daemon-kncc4" (UID: "6954da61-bafb-4b35-aa61-0f120c34c747") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:02 crc kubenswrapper[4693]: I1204 09:43:02.461155 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:02 crc kubenswrapper[4693]: I1204 09:43:02.461191 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:02 crc kubenswrapper[4693]: E1204 09:43:02.461716 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:02 crc kubenswrapper[4693]: E1204 09:43:02.462020 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.023650 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.023881 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:11.023842679 +0000 UTC m=+36.921436462 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.024197 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.024318 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.024408 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:11.024390963 +0000 UTC m=+36.921984716 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.123870 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" event={"ID":"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca","Type":"ContainerStarted","Data":"ad653736d607d800134c5500074297a4b21ebaaa57b6d6526c0ecdf077b1fbfb"} Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.124748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.124884 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.124965 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.124980 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:03 crc kubenswrapper[4693]: 
I1204 09:43:03.125147 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.125176 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.125200 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:11.125183306 +0000 UTC m=+37.022777059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.125588 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.125642 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:11.125626049 +0000 UTC m=+37.023219802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.125737 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.126053 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.126158 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.126290 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:11.126277936 +0000 UTC m=+37.023871689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.460448 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.460582 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:03 crc kubenswrapper[4693]: I1204 09:43:03.460890 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:03 crc kubenswrapper[4693]: E1204 09:43:03.461283 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
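
An aside on the repeated "object ... not registered" failures above (metrics-daemon-secret, networking-console-plugin-cert, nginx-conf, kube-root-ca.crt, openshift-service-ca.crt): the kubelet resolves secret and configMap volume sources from per-namespace caches that only become usable once the matching reflector has synced, which is what the later "Caches populated for *v1.Secret / *v1.ConfigMap from object-..." lines record; after that point the same mounts stop failing. A minimal Go sketch of that gate, using an assumed map-backed registry rather than the kubelet's actual cache types:

package main

import "fmt"

type objectKey struct{ namespace, name string }

// secretCache stands in for the kubelet's per-namespace object cache
// (an assumption for this sketch, not a real kubelet type).
type secretCache struct {
	registered map[objectKey][]byte
}

func (c *secretCache) Get(namespace, name string) ([]byte, error) {
	data, ok := c.registered[objectKey{namespace, name}]
	if !ok {
		return nil, fmt.Errorf("object %q/%q not registered", namespace, name)
	}
	return data, nil
}

func main() {
	c := &secretCache{registered: map[objectKey][]byte{}}

	// Before the reflector for the namespace has populated the cache,
	// the lookup fails exactly like the MountVolume errors above.
	if _, err := c.Get("openshift-multus", "metrics-daemon-secret"); err != nil {
		fmt.Println("mount fails:", err)
	}

	// After the equivalent of "Caches populated", the same lookup succeeds.
	c.registered[objectKey{"openshift-multus", "metrics-daemon-secret"}] = []byte("tls data")
	if _, err := c.Get("openshift-multus", "metrics-daemon-secret"); err == nil {
		fmt.Println("mount can proceed")
	}
}

Until the entry exists, every retry hits the same error, which is why the failures above recur on the backoff schedule instead of resolving immediately.
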
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.127990 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" event={"ID":"87cfa58a-1bae-4902-83fa-cfdf4f57fd1d","Type":"ContainerStarted","Data":"0160ece21434f1271c8a16920e363d9f7dc1e68d221f5ed701fba4b624c52b01"} Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.131218 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.133322 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="dbf35c3a35a2b89ce4a5752b7a30d5eb47b7fccd0304448e6635467b996f93ef" exitCode=0 Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.133415 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"dbf35c3a35a2b89ce4a5752b7a30d5eb47b7fccd0304448e6635467b996f93ef"} Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.135412 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" event={"ID":"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca","Type":"ContainerStarted","Data":"fb6c2e981fd0dc94d28907e16a6b0f613167b90168bff46d96738671022112e6"} Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.135440 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" event={"ID":"5617b7c3-ba0c-4a8e-beef-73e3a1fce3ca","Type":"ContainerStarted","Data":"ddc98d83d0e40efc5ef3113f7813bf25ad4dd21bf636c8e030ac481aeb46f8ff"} Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.160024 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77nf7" podStartSLOduration=8.160006182 podStartE2EDuration="8.160006182s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:04.159064367 +0000 UTC m=+30.056658120" watchObservedRunningTime="2025-12-04 09:43:04.160006182 +0000 UTC m=+30.057599935" Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.160371 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rr4z7" podStartSLOduration=8.160363422 podStartE2EDuration="8.160363422s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:04.144380371 +0000 UTC m=+30.041974154" watchObservedRunningTime="2025-12-04 09:43:04.160363422 +0000 UTC m=+30.057957195" Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.224219 4693 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.337358 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:04 crc kubenswrapper[4693]: E1204 09:43:04.337512 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:04 crc kubenswrapper[4693]: E1204 09:43:04.337721 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs podName:6954da61-bafb-4b35-aa61-0f120c34c747 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:08.337705705 +0000 UTC m=+34.235299458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs") pod "network-metrics-daemon-kncc4" (UID: "6954da61-bafb-4b35-aa61-0f120c34c747") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.460172 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:04 crc kubenswrapper[4693]: I1204 09:43:04.460194 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:04 crc kubenswrapper[4693]: E1204 09:43:04.460898 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:04 crc kubenswrapper[4693]: E1204 09:43:04.460958 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:05 crc kubenswrapper[4693]: I1204 09:43:05.460885 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:05 crc kubenswrapper[4693]: I1204 09:43:05.460897 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:05 crc kubenswrapper[4693]: E1204 09:43:05.461132 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:05 crc kubenswrapper[4693]: E1204 09:43:05.461009 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:06 crc kubenswrapper[4693]: I1204 09:43:06.148771 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerStarted","Data":"ce671f9c9053409dab5362d805e692b7a972fbbe4e24d18cb87a0a41869e1905"} Dec 04 09:43:06 crc kubenswrapper[4693]: I1204 09:43:06.157255 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerStarted","Data":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} Dec 04 09:43:06 crc kubenswrapper[4693]: I1204 09:43:06.461840 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:06 crc kubenswrapper[4693]: I1204 09:43:06.461876 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:06 crc kubenswrapper[4693]: E1204 09:43:06.461991 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:06 crc kubenswrapper[4693]: E1204 09:43:06.462132 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.163833 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="ce671f9c9053409dab5362d805e692b7a972fbbe4e24d18cb87a0a41869e1905" exitCode=0 Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.165961 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"ce671f9c9053409dab5362d805e692b7a972fbbe4e24d18cb87a0a41869e1905"} Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.166104 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.166196 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.166218 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.192096 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.202796 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podStartSLOduration=11.202773776 podStartE2EDuration="11.202773776s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:07.197642248 +0000 UTC m=+33.095236001" watchObservedRunningTime="2025-12-04 09:43:07.202773776 +0000 UTC m=+33.100367539" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.213977 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.460919 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:07 crc kubenswrapper[4693]: E1204 09:43:07.461036 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:07 crc kubenswrapper[4693]: I1204 09:43:07.460937 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:07 crc kubenswrapper[4693]: E1204 09:43:07.461199 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.169984 4693 generic.go:334] "Generic (PLEG): container finished" podID="24bfab1f-25f4-44b3-89cf-bb7fda55c2de" containerID="1ca103c451d925704c78c4170f06dd1233fb41a27bfe61c143a041afeaefed23" exitCode=0 Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.170030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerDied","Data":"1ca103c451d925704c78c4170f06dd1233fb41a27bfe61c143a041afeaefed23"} Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.218064 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kncc4"] Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.218180 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:08 crc kubenswrapper[4693]: E1204 09:43:08.218266 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.382891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:08 crc kubenswrapper[4693]: E1204 09:43:08.383025 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:08 crc kubenswrapper[4693]: E1204 09:43:08.383075 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs podName:6954da61-bafb-4b35-aa61-0f120c34c747 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:16.383061137 +0000 UTC m=+42.280654890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs") pod "network-metrics-daemon-kncc4" (UID: "6954da61-bafb-4b35-aa61-0f120c34c747") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.460690 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:08 crc kubenswrapper[4693]: E1204 09:43:08.460812 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:08 crc kubenswrapper[4693]: I1204 09:43:08.550296 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:09 crc kubenswrapper[4693]: I1204 09:43:09.179345 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cd545" event={"ID":"24bfab1f-25f4-44b3-89cf-bb7fda55c2de","Type":"ContainerStarted","Data":"0cd8e4d382b7a865a7c27753e672c3efddb2391e51a303a678d77c0a9deef678"} Dec 04 09:43:09 crc kubenswrapper[4693]: I1204 09:43:09.209288 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cd545" podStartSLOduration=13.209265357 podStartE2EDuration="13.209265357s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:09.209221746 +0000 UTC m=+35.106815499" watchObservedRunningTime="2025-12-04 09:43:09.209265357 +0000 UTC m=+35.106859130" Dec 04 09:43:09 crc kubenswrapper[4693]: I1204 09:43:09.460893 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:09 crc kubenswrapper[4693]: I1204 09:43:09.460900 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:09 crc kubenswrapper[4693]: E1204 09:43:09.461368 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:09 crc kubenswrapper[4693]: E1204 09:43:09.461473 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.460809 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.460932 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:10 crc kubenswrapper[4693]: E1204 09:43:10.461060 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:10 crc kubenswrapper[4693]: E1204 09:43:10.461218 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kncc4" podUID="6954da61-bafb-4b35-aa61-0f120c34c747" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.958615 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.958785 4693 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.990874 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj"] Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.991927 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.995230 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjrqj"] Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.995820 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.996301 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gh7dl"] Dec 04 09:43:10 crc kubenswrapper[4693]: I1204 09:43:10.996833 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.003891 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.004580 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.010458 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.010652 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.010755 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.011314 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.011566 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.011915 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012059 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012066 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012137 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012137 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012306 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012430 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012568 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012702 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012727 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012789 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012835 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012843 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 09:43:11 crc 
kubenswrapper[4693]: I1204 09:43:11.013023 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.013129 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.012990 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.015242 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.016194 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.016641 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.016954 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.022156 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.025271 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.025931 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.026428 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sg997"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.028675 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.029155 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.029411 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kn9rb"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.029495 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.044880 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.045900 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.047728 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.048303 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.050258 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.050896 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.051005 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzx57"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.052234 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.052933 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053070 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053407 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053442 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053524 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053608 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053692 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.053792 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054137 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054360 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054465 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054535 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054785 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.056131 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054819 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054921 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.054986 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.055145 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.055182 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.055286 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.055402 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.055457 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.057116 4693 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.064404 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zctpg"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.064933 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.065294 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.065881 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.066064 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.066090 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.066444 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.068199 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.068739 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.069023 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.069891 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.070135 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.070293 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.071673 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.071903 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.072101 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.072130 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.072867 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.073526 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.073950 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.074075 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.074091 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.074381 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.074500 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.074770 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.076191 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.076584 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.076645 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.076836 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.076998 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077098 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077202 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077295 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077405 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077533 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077727 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077836 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.077908 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.080929 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.081543 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.084995 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.085220 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.085885 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.093437 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.096739 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.096811 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.097021 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.097138 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.097195 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.097595 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.097822 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.098221 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.098416 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.098587 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116022 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116165 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116449 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116474 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116449 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116598 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.116696 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 
09:43:11.117072 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117123 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117248 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3246be0-0c88-49c2-8cee-05c3661a509e-serving-cert\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117291 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117309 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7g8k\" (UniqueName: \"kubernetes.io/projected/5c8b51b7-718d-43b5-9e18-58966747279f-kube-api-access-h7g8k\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117326 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-config\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117359 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-dir\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117374 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117390 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a14b590-276f-49be-961a-c459c975c8ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117405 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-policies\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c8b51b7-718d-43b5-9e18-58966747279f-machine-approver-tls\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117442 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117456 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-auth-proxy-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117488 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117503 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-client\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: 
\"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117522 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mrh\" (UniqueName: \"kubernetes.io/projected/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-kube-api-access-g9mrh\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117537 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-images\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117552 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf4n\" (UniqueName: \"kubernetes.io/projected/6a14b590-276f-49be-961a-c459c975c8ab-kube-api-access-rjf4n\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117562 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.117639 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.117626096 +0000 UTC m=+53.015219849 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117567 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117753 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117771 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117786 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117802 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8n4\" (UniqueName: \"kubernetes.io/projected/46a329a4-a450-4e39-bcbe-c7dcba1e6939-kube-api-access-tq8n4\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117818 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdqf\" (UniqueName: \"kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117846 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwj4\" (UniqueName: \"kubernetes.io/projected/57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e-kube-api-access-czwj4\") pod \"downloads-7954f5f757-sg997\" (UID: \"57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e\") " 
pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljqhr\" (UniqueName: \"kubernetes.io/projected/f30a48a3-da96-4844-8a96-3478db0a7018-kube-api-access-ljqhr\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117887 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117893 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117903 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117920 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f30a48a3-da96-4844-8a96-3478db0a7018-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117935 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a329a4-a450-4e39-bcbe-c7dcba1e6939-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117949 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117965 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-config\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117978 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.117993 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118008 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118023 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98r9\" (UniqueName: \"kubernetes.io/projected/b3246be0-0c88-49c2-8cee-05c3661a509e-kube-api-access-q98r9\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp6xh\" (UniqueName: \"kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118053 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-trusted-ca\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118068 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118083 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 
crc kubenswrapper[4693]: I1204 09:43:11.118098 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-serving-cert\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118111 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-encryption-config\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118126 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a48a3-da96-4844-8a96-3478db0a7018-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118141 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118157 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118209 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.118261 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.118289 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.118282113 +0000 UTC m=+53.015875866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118301 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.118689 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw"]
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.119266 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.119647 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.121995 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6"]
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.122496 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6"
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.123436 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg"]
Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.124143 4693 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.124627 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.124707 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.125637 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.126954 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.127118 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.132634 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.133234 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.135881 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-55r8b"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.136496 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.136500 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.137795 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.138439 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.139927 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.144940 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzbkw"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.145607 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.145823 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.148279 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.148661 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.156671 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjrqj"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.156719 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.171904 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.172076 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.172538 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.172656 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zkq2l"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.172820 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.173146 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.173424 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.174074 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lxk9g"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.174223 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.174721 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.176381 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.176512 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.181509 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.181633 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.182187 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gh7dl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.182220 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6jrqg"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.182247 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.182962 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.183627 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.184831 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-74srj"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.185271 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.186046 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wjxqk"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.186900 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.187367 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.188976 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.189528 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.190173 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.191016 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.192180 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.193346 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.194072 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.194991 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z9fxv"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.196108 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sg997"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.196201 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.197134 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.199739 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kn9rb"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.201289 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.202637 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-94np6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.203268 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.204403 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.205632 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.206886 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzx57"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.208057 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.209236 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzbkw"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.210624 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.213661 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.213952 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.215294 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.216977 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.218573 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219002 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/debdfef3-4184-4a37-a818-e5c41c81e2fd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnk99\" (UniqueName: \"kubernetes.io/projected/debdfef3-4184-4a37-a818-e5c41c81e2fd-kube-api-access-nnk99\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219066 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: 
\"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219096 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-auth-proxy-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219122 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c8b51b7-718d-43b5-9e18-58966747279f-machine-approver-tls\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219148 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttw5\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-kube-api-access-6ttw5\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219172 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219197 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219221 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219266 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" 
(UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219287 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219310 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b50c0-572f-4534-8a96-af514ff81953-service-ca-bundle\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219359 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mrh\" (UniqueName: \"kubernetes.io/projected/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-kube-api-access-g9mrh\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219387 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78528eec-0bba-40f0-9739-0a4e951d53da-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96a1772-5a62-4b57-bc83-4faa4dcb1260-config\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219438 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-client\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-images\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 
09:43:11.219502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e20f16-dfe1-45b4-8b13-860011aac931-trusted-ca\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219523 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-serving-cert\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219543 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219563 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit-dir\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219590 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf4n\" (UniqueName: \"kubernetes.io/projected/6a14b590-276f-49be-961a-c459c975c8ab-kube-api-access-rjf4n\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219614 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9v82\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-kube-api-access-s9v82\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219678 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjpqc\" (UniqueName: \"kubernetes.io/projected/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-kube-api-access-qjpqc\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219701 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a85a548f-a204-4573-ae20-c62a17e17df3-tmpfs\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219725 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219749 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219758 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-auth-proxy-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219774 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-metrics-certs\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219800 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219824 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219849 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8n4\" (UniqueName: \"kubernetes.io/projected/46a329a4-a450-4e39-bcbe-c7dcba1e6939-kube-api-access-tq8n4\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.219867 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.219883 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.219892 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.219932 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.21991799 +0000 UTC m=+53.117511743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.219870 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdqf\" (UniqueName: \"kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220298 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220357 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwj4\" (UniqueName: \"kubernetes.io/projected/57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e-kube-api-access-czwj4\") pod \"downloads-7954f5f757-sg997\" (UID: \"57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e\") " pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220418 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-webhook-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220441 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-config\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc 
kubenswrapper[4693]: I1204 09:43:11.220459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220475 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-srv-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220572 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220611 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljqhr\" (UniqueName: \"kubernetes.io/projected/f30a48a3-da96-4844-8a96-3478db0a7018-kube-api-access-ljqhr\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220781 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220796 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220808 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220842 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.220830043 +0000 UTC m=+53.118423796 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220870 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220876 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: E1204 09:43:11.220899 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.220889566 +0000 UTC m=+53.118483399 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220934 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.220983 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221037 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-client\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221065 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-default-certificate\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " 
pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221142 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221187 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f30a48a3-da96-4844-8a96-3478db0a7018-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221498 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9dm8\" (UniqueName: \"kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221649 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221741 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221789 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-image-import-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.221824 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a329a4-a450-4e39-bcbe-c7dcba1e6939-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc 
kubenswrapper[4693]: I1204 09:43:11.221998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222044 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-images\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222222 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96a1772-5a62-4b57-bc83-4faa4dcb1260-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222323 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-config\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222425 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222448 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxbb\" (UniqueName: \"kubernetes.io/projected/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-kube-api-access-5cxbb\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222475 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52cdcb0-2915-493e-ab48-7c863e590ee2-config\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222500 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222556 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222577 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q98r9\" (UniqueName: \"kubernetes.io/projected/b3246be0-0c88-49c2-8cee-05c3661a509e-kube-api-access-q98r9\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222600 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp6xh\" (UniqueName: \"kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222620 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-trusted-ca\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222694 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgv6b\" (UniqueName: \"kubernetes.io/projected/a0d11105-8b91-4aad-8c32-dfe2ef976028-kube-api-access-jgv6b\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222718 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47l5\" (UniqueName: \"kubernetes.io/projected/78528eec-0bba-40f0-9739-0a4e951d53da-kube-api-access-m47l5\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222742 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-serving-cert\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222772 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222820 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a96a1772-5a62-4b57-bc83-4faa4dcb1260-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222842 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-serving-cert\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222847 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222863 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5x6q\" (UniqueName: \"kubernetes.io/projected/833b50c0-572f-4534-8a96-af514ff81953-kube-api-access-z5x6q\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222910 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e20f16-dfe1-45b4-8b13-860011aac931-metrics-tls\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222931 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-serving-cert\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.222976 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-profile-collector-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223003 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-srv-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223021 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-stats-auth\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223044 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223065 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-encryption-config\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223087 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c52cdcb0-2915-493e-ab48-7c863e590ee2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a48a3-da96-4844-8a96-3478db0a7018-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223133 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdnh\" (UniqueName: \"kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-encryption-config\") pod 
\"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223175 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223227 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zkq2l"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223305 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223327 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3246be0-0c88-49c2-8cee-05c3661a509e-serving-cert\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223367 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rwh\" (UniqueName: \"kubernetes.io/projected/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-kube-api-access-l7rwh\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223393 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp27r\" (UniqueName: \"kubernetes.io/projected/a85a548f-a204-4573-ae20-c62a17e17df3-kube-api-access-lp27r\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223421 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223445 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fm6\" (UniqueName: \"kubernetes.io/projected/8d1cb4d7-e1d9-4755-bee4-c571f8cffcba-kube-api-access-79fm6\") pod \"migrator-59844c95c7-qn6gc\" (UID: \"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-node-pullsecrets\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223520 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223593 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223700 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223727 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223772 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7g8k\" (UniqueName: \"kubernetes.io/projected/5c8b51b7-718d-43b5-9e18-58966747279f-kube-api-access-h7g8k\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223792 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-config\") pod \"console-operator-58897d9998-zjrqj\" (UID: 
\"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223809 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-dir\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223849 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223870 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223899 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a14b590-276f-49be-961a-c459c975c8ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223940 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78528eec-0bba-40f0-9739-0a4e951d53da-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223958 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-proxy-tls\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.229503 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.229591 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " 
pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.229559 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a329a4-a450-4e39-bcbe-c7dcba1e6939-config\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.229659 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-policies\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.229693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vg2f\" (UniqueName: \"kubernetes.io/projected/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-kube-api-access-9vg2f\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.232887 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.233675 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46a329a4-a450-4e39-bcbe-c7dcba1e6939-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.233843 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234022 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.223985 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8b51b7-718d-43b5-9e18-58966747279f-config\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234563 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-config\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234599 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a48a3-da96-4844-8a96-3478db0a7018-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234626 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52cdcb0-2915-493e-ab48-7c863e590ee2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234648 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5wk\" (UniqueName: \"kubernetes.io/projected/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-kube-api-access-fc5wk\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234676 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-images\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234702 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234834 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.234943 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.235009 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-etcd-client\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.235233 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c8b51b7-718d-43b5-9e18-58966747279f-machine-approver-tls\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.235406 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.235510 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-policies\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236186 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236488 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236597 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3246be0-0c88-49c2-8cee-05c3661a509e-serving-cert\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236800 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-trusted-ca\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236865 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-audit-dir\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.237170 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f30a48a3-da96-4844-8a96-3478db0a7018-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.236898 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.238231 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.238475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3246be0-0c88-49c2-8cee-05c3661a509e-config\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.238668 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.238773 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.240667 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.241734 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a14b590-276f-49be-961a-c459c975c8ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.241966 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.243902 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.247415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-encryption-config\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.246618 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.247878 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-serving-cert\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.248979 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.250993 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zctpg"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.252458 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.253263 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.254002 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.254519 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.255207 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.256282 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lxk9g"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.257453 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.258711 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6jrqg"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.260524 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z9fxv"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.261547 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wjxqk"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 
09:43:11.262553 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.263588 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5vvnq"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.264981 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5vvnq"] Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.265064 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.275491 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.294596 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.314506 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336850 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96a1772-5a62-4b57-bc83-4faa4dcb1260-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336887 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336910 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-image-import-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336938 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxbb\" (UniqueName: \"kubernetes.io/projected/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-kube-api-access-5cxbb\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.336980 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c52cdcb0-2915-493e-ab48-7c863e590ee2-config\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337006 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgv6b\" (UniqueName: \"kubernetes.io/projected/a0d11105-8b91-4aad-8c32-dfe2ef976028-kube-api-access-jgv6b\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47l5\" (UniqueName: \"kubernetes.io/projected/78528eec-0bba-40f0-9739-0a4e951d53da-kube-api-access-m47l5\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337050 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-serving-cert\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e20f16-dfe1-45b4-8b13-860011aac931-metrics-tls\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337085 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a96a1772-5a62-4b57-bc83-4faa4dcb1260-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337104 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-serving-cert\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337126 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5x6q\" (UniqueName: \"kubernetes.io/projected/833b50c0-572f-4534-8a96-af514ff81953-kube-api-access-z5x6q\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337145 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-profile-collector-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337175 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-srv-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337192 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-stats-auth\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337213 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c52cdcb0-2915-493e-ab48-7c863e590ee2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337233 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-encryption-config\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337251 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdnh\" (UniqueName: \"kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337269 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337286 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337303 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337344 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rwh\" (UniqueName: \"kubernetes.io/projected/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-kube-api-access-l7rwh\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp27r\" (UniqueName: \"kubernetes.io/projected/a85a548f-a204-4573-ae20-c62a17e17df3-kube-api-access-lp27r\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337388 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337413 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fm6\" (UniqueName: \"kubernetes.io/projected/8d1cb4d7-e1d9-4755-bee4-c571f8cffcba-kube-api-access-79fm6\") pod \"migrator-59844c95c7-qn6gc\" (UID: \"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337431 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-node-pullsecrets\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337457 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337497 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78528eec-0bba-40f0-9739-0a4e951d53da-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-proxy-tls\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337537 
4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vg2f\" (UniqueName: \"kubernetes.io/projected/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-kube-api-access-9vg2f\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337580 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-config\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337597 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337615 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337633 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-images\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337651 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337668 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52cdcb0-2915-493e-ab48-7c863e590ee2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5wk\" (UniqueName: 
\"kubernetes.io/projected/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-kube-api-access-fc5wk\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337708 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnk99\" (UniqueName: \"kubernetes.io/projected/debdfef3-4184-4a37-a818-e5c41c81e2fd-kube-api-access-nnk99\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337730 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/debdfef3-4184-4a37-a818-e5c41c81e2fd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337767 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttw5\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-kube-api-access-6ttw5\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337795 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337812 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337831 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b50c0-572f-4534-8a96-af514ff81953-service-ca-bundle\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337857 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78528eec-0bba-40f0-9739-0a4e951d53da-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: 
\"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337877 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96a1772-5a62-4b57-bc83-4faa4dcb1260-config\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337921 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e20f16-dfe1-45b4-8b13-860011aac931-trusted-ca\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337946 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-serving-cert\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337966 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.337983 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit-dir\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338002 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9v82\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-kube-api-access-s9v82\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338021 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjpqc\" (UniqueName: \"kubernetes.io/projected/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-kube-api-access-qjpqc\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338044 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a85a548f-a204-4573-ae20-c62a17e17df3-tmpfs\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338069 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-metrics-certs\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338120 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-webhook-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-config\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338176 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338195 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-srv-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338212 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338230 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338247 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle\") pod \"console-f9d7485db-mhzjn\" (UID: 
\"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338283 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-client\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338301 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-default-certificate\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.338349 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9dm8\" (UniqueName: \"kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.342013 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.342903 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-image-import-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.346090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-serving-cert\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.351441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-node-pullsecrets\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.354300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-encryption-config\") 
pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.368102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-profile-collector-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.370478 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.370803 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.371413 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a85a548f-a204-4573-ae20-c62a17e17df3-tmpfs\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.371835 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-profile-collector-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.371896 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.372017 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.372476 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-service-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.372653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.372685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.372981 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-config\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.373241 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.373554 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-config\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.374154 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.374748 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.375255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-serving-ca\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.376002 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.377181 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 09:43:11 crc 
kubenswrapper[4693]: I1204 09:43:11.378239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-serving-cert\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.378577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.379093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.379624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78528eec-0bba-40f0-9739-0a4e951d53da-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.379683 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-audit-dir\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.381845 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78528eec-0bba-40f0-9739-0a4e951d53da-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.383671 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.384283 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-serving-cert\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.384317 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-etcd-client\") pod \"apiserver-76f77b778f-tzx57\" 
(UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.386046 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-images\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.390457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.390886 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.392630 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-webhook-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.394631 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.395780 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a85a548f-a204-4573-ae20-c62a17e17df3-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.414104 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.425791 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-proxy-tls\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.434194 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.444604 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a0d11105-8b91-4aad-8c32-dfe2ef976028-srv-cert\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.454189 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.460548 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.460572 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.474730 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.495577 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.499295 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e20f16-dfe1-45b4-8b13-860011aac931-metrics-tls\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.521295 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.524866 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e20f16-dfe1-45b4-8b13-860011aac931-trusted-ca\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.534108 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.554804 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.574799 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.582389 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96a1772-5a62-4b57-bc83-4faa4dcb1260-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.595296 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.615502 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.620926 4693 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96a1772-5a62-4b57-bc83-4faa4dcb1260-config\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.634531 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.645350 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-srv-cert\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.654819 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.674958 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.685454 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-metrics-certs\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.694597 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.715122 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.724649 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-stats-auth\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.734670 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.740991 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b50c0-572f-4534-8a96-af514ff81953-service-ca-bundle\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.754846 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.760803 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/833b50c0-572f-4534-8a96-af514ff81953-default-certificate\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:11 crc kubenswrapper[4693]: 
I1204 09:43:11.775413 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.794779 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.814493 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.834582 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.838676 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c52cdcb0-2915-493e-ab48-7c863e590ee2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.854812 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.864045 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52cdcb0-2915-493e-ab48-7c863e590ee2-config\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.874802 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.894518 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.905092 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/debdfef3-4184-4a37-a818-e5c41c81e2fd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.934722 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.954485 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 09:43:11 crc kubenswrapper[4693]: I1204 09:43:11.974380 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.001289 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.014102 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.034197 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.055763 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.074872 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.095272 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.114258 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.135206 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.154587 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.174530 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.193277 4693 request.go:700] Waited for 1.019542844s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Detcd-service-ca-bundle&limit=500&resourceVersion=0 Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.194773 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.215028 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.234804 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.255208 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.275025 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.295264 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.314213 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.334285 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.357475 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.375050 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.394425 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.414903 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.435221 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.460255 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.460300 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.460897 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.474070 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.494658 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.515047 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.534817 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.554404 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.574999 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.595295 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.615472 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.634958 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.654480 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.675038 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.694112 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.714642 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.735254 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.754538 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.775002 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.795473 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.814850 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.835174 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.854127 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.874644 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.895814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.915184 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.935070 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.954485 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.974140 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 09:43:12 crc kubenswrapper[4693]: I1204 09:43:12.994893 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.031048 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mrh\" (UniqueName: \"kubernetes.io/projected/ca1189f1-0dee-459a-bb4f-dfda69f2eee1-kube-api-access-g9mrh\") pod \"apiserver-7bbb656c7d-zkx54\" (UID: \"ca1189f1-0dee-459a-bb4f-dfda69f2eee1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.052085 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdqf\" (UniqueName: \"kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf\") pod 
\"oauth-openshift-558db77b4-jznlz\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.073764 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8n4\" (UniqueName: \"kubernetes.io/projected/46a329a4-a450-4e39-bcbe-c7dcba1e6939-kube-api-access-tq8n4\") pod \"machine-api-operator-5694c8668f-gh7dl\" (UID: \"46a329a4-a450-4e39-bcbe-c7dcba1e6939\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.092217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljqhr\" (UniqueName: \"kubernetes.io/projected/f30a48a3-da96-4844-8a96-3478db0a7018-kube-api-access-ljqhr\") pod \"openshift-apiserver-operator-796bbdcf4f-9x82p\" (UID: \"f30a48a3-da96-4844-8a96-3478db0a7018\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.123907 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf4n\" (UniqueName: \"kubernetes.io/projected/6a14b590-276f-49be-961a-c459c975c8ab-kube-api-access-rjf4n\") pod \"cluster-samples-operator-665b6dd947-5fvns\" (UID: \"6a14b590-276f-49be-961a-c459c975c8ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.137505 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwj4\" (UniqueName: \"kubernetes.io/projected/57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e-kube-api-access-czwj4\") pod \"downloads-7954f5f757-sg997\" (UID: \"57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e\") " pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.155975 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp6xh\" (UniqueName: \"kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh\") pod \"route-controller-manager-6576b87f9c-5dnpl\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.174698 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q98r9\" (UniqueName: \"kubernetes.io/projected/b3246be0-0c88-49c2-8cee-05c3661a509e-kube-api-access-q98r9\") pod \"console-operator-58897d9998-zjrqj\" (UID: \"b3246be0-0c88-49c2-8cee-05c3661a509e\") " pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.186706 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.194581 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.195520 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7g8k\" (UniqueName: \"kubernetes.io/projected/5c8b51b7-718d-43b5-9e18-58966747279f-kube-api-access-h7g8k\") pod \"machine-approver-56656f9798-kkwtj\" (UID: \"5c8b51b7-718d-43b5-9e18-58966747279f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.212693 4693 request.go:700] Waited for 1.947308047s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.215631 4693 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.224955 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.234940 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.236112 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.244122 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.253561 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.273166 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.280791 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9dm8\" (UniqueName: \"kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8\") pod \"collect-profiles-29414010-4vpsw\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.281564 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.301602 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.319906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxbb\" (UniqueName: \"kubernetes.io/projected/e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8-kube-api-access-5cxbb\") pod \"machine-config-operator-74547568cd-wkns5\" (UID: \"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.336234 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgv6b\" (UniqueName: \"kubernetes.io/projected/a0d11105-8b91-4aad-8c32-dfe2ef976028-kube-api-access-jgv6b\") pod \"catalog-operator-68c6474976-pstsq\" (UID: \"a0d11105-8b91-4aad-8c32-dfe2ef976028\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.364043 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47l5\" (UniqueName: \"kubernetes.io/projected/78528eec-0bba-40f0-9739-0a4e951d53da-kube-api-access-m47l5\") pod \"openshift-controller-manager-operator-756b6f6bc6-vflmh\" (UID: \"78528eec-0bba-40f0-9739-0a4e951d53da\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.376756 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a96a1772-5a62-4b57-bc83-4faa4dcb1260-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zntf6\" (UID: \"a96a1772-5a62-4b57-bc83-4faa4dcb1260\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.380590 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.394227 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnk99\" (UniqueName: \"kubernetes.io/projected/debdfef3-4184-4a37-a818-e5c41c81e2fd-kube-api-access-nnk99\") pod \"multus-admission-controller-857f4d67dd-hzbkw\" (UID: \"debdfef3-4184-4a37-a818-e5c41c81e2fd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.409508 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.415859 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5x6q\" (UniqueName: \"kubernetes.io/projected/833b50c0-572f-4534-8a96-af514ff81953-kube-api-access-z5x6q\") pod \"router-default-5444994796-55r8b\" (UID: \"833b50c0-572f-4534-8a96-af514ff81953\") " pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.416085 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.417084 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.430946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c52cdcb0-2915-493e-ab48-7c863e590ee2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mq9k6\" (UID: \"c52cdcb0-2915-493e-ab48-7c863e590ee2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.460998 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.462573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdnh\" (UniqueName: \"kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh\") pod \"console-f9d7485db-mhzjn\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.464296 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gh7dl"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.469525 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rwh\" (UniqueName: \"kubernetes.io/projected/1a2d7ab3-ae2e-4c16-a1c9-97997ded9506-kube-api-access-l7rwh\") pod \"olm-operator-6b444d44fb-65lkg\" (UID: \"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.487804 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.491451 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.498510 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9v82\" (UniqueName: \"kubernetes.io/projected/57e20f16-dfe1-45b4-8b13-860011aac931-kube-api-access-s9v82\") pod \"ingress-operator-5b745b69d9-hz7pw\" (UID: \"57e20f16-dfe1-45b4-8b13-860011aac931\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.498747 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.505455 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.516804 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjpqc\" (UniqueName: \"kubernetes.io/projected/d5fc4cb9-89af-47dd-b38a-2378c774a9c5-kube-api-access-qjpqc\") pod \"authentication-operator-69f744f599-kn9rb\" (UID: \"d5fc4cb9-89af-47dd-b38a-2378c774a9c5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.540523 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttw5\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-kube-api-access-6ttw5\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.566672 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.573046 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp27r\" (UniqueName: \"kubernetes.io/projected/a85a548f-a204-4573-ae20-c62a17e17df3-kube-api-access-lp27r\") pod \"packageserver-d55dfcdfc-7jdxk\" (UID: \"a85a548f-a204-4573-ae20-c62a17e17df3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.575886 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c1db76-c273-46ee-a00e-3dae4dc1ed6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfjld\" (UID: \"39c1db76-c273-46ee-a00e-3dae4dc1ed6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.589820 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.593827 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vg2f\" (UniqueName: \"kubernetes.io/projected/1c46aadf-ba22-4bdc-b76a-8b9ad8880368-kube-api-access-9vg2f\") pod \"openshift-config-operator-7777fb866f-zctpg\" (UID: \"1c46aadf-ba22-4bdc-b76a-8b9ad8880368\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.599672 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.611744 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.612866 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fm6\" (UniqueName: \"kubernetes.io/projected/8d1cb4d7-e1d9-4755-bee4-c571f8cffcba-kube-api-access-79fm6\") pod \"migrator-59844c95c7-qn6gc\" (UID: \"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.666263 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.666613 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.668262 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.669078 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5wk\" (UniqueName: \"kubernetes.io/projected/c7e74be3-f8f4-4f94-8f11-657cb2c75ceb-kube-api-access-fc5wk\") pod \"apiserver-76f77b778f-tzx57\" (UID: \"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb\") " pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.678875 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.695437 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.702553 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.715835 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.721259 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.726716 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.726907 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.735067 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.754654 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771601 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771649 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771671 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771691 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771710 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59sj2\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771731 
4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771756 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5j2g\" (UniqueName: \"kubernetes.io/projected/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-kube-api-access-s5j2g\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771840 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771878 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.771916 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: E1204 09:43:13.772514 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.272494669 +0000 UTC m=+40.170088492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.775550 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 09:43:13 crc kubenswrapper[4693]: W1204 09:43:13.785560 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68219a06_b58a_4d36_b851_32dd1e4a2ec5.slice/crio-8e21fa0f0d1ae5a6f363f2557d09422747dff2e715af252b826e3e78292a54a1 WatchSource:0}: Error finding container 8e21fa0f0d1ae5a6f363f2557d09422747dff2e715af252b826e3e78292a54a1: Status 404 returned error can't find the container with id 8e21fa0f0d1ae5a6f363f2557d09422747dff2e715af252b826e3e78292a54a1 Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875482 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-config\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-socket-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875709 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-service-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875729 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-cabundle\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875789 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qz9\" (UniqueName: \"kubernetes.io/projected/bccb2393-5218-4cd0-9b7e-c9d19eab391b-kube-api-access-l5qz9\") pod 
\"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875865 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40c792e-b179-402f-82d0-9744288c0680-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875896 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xrf\" (UniqueName: \"kubernetes.io/projected/7180535e-ac7a-4999-a926-3a6ffe02852c-kube-api-access-f8xrf\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875914 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-registration-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875949 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.875968 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876060 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: 
\"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876097 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43747056-18ae-4153-9d16-9d4f4330ddb3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876121 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f800f28a-197d-4e69-b66d-3840065b674e-config-volume\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2kz\" (UniqueName: \"kubernetes.io/projected/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-kube-api-access-js2kz\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876196 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876790 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876900 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2x75\" (UniqueName: \"kubernetes.io/projected/f800f28a-197d-4e69-b66d-3840065b674e-kube-api-access-m2x75\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.876953 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59sj2\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877044 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpr68\" (UniqueName: \"kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877107 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081b2ab6-1edc-4ec5-9054-367d9f5951af-cert\") pod \"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877127 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f800f28a-197d-4e69-b66d-3840065b674e-metrics-tls\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877150 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877176 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877194 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40c792e-b179-402f-82d0-9744288c0680-proxy-tls\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877245 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5j2g\" (UniqueName: \"kubernetes.io/projected/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-kube-api-access-s5j2g\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877261 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-key\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877287 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm66h\" (UniqueName: \"kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:13 crc 
kubenswrapper[4693]: I1204 09:43:13.877309 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877380 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-mountpoint-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877410 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54llr\" (UniqueName: \"kubernetes.io/projected/3e92c71e-1bcc-455f-a270-1dd051662af6-kube-api-access-54llr\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877427 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-serving-cert\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877443 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hndvd\" (UniqueName: \"kubernetes.io/projected/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-kube-api-access-hndvd\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877458 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43747056-18ae-4153-9d16-9d4f4330ddb3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877504 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-metrics-tls\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877521 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzkp\" (UniqueName: \"kubernetes.io/projected/4d293772-95ef-4025-ab46-0b150e902a0f-kube-api-access-9qzkp\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 
09:43:13.877590 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877607 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-config\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877634 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc67k\" (UniqueName: \"kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877651 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstcd\" (UniqueName: \"kubernetes.io/projected/e40c792e-b179-402f-82d0-9744288c0680-kube-api-access-gstcd\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877678 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-node-bootstrap-token\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877696 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5kb\" (UniqueName: \"kubernetes.io/projected/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-kube-api-access-jp5kb\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877712 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-serving-cert\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877739 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 
09:43:13.877761 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfvdf\" (UniqueName: \"kubernetes.io/projected/081b2ab6-1edc-4ec5-9054-367d9f5951af-kube-api-access-pfvdf\") pod \"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877844 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-client\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877863 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e92c71e-1bcc-455f-a270-1dd051662af6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.877943 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-certs\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.878027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: E1204 09:43:13.893979 4693 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.393930289 +0000 UTC m=+40.291524052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.896583 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-csi-data-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.896676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.896703 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-plugins-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.896727 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcff\" (UniqueName: \"kubernetes.io/projected/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-kube-api-access-wlcff\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.897293 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43747056-18ae-4153-9d16-9d4f4330ddb3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.897420 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.897449 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.908844 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.977485 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5j2g\" (UniqueName: \"kubernetes.io/projected/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-kube-api-access-s5j2g\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.977702 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.981447 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.983055 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.984650 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sg997"] Dec 04 09:43:13 crc kubenswrapper[4693]: I1204 09:43:13.986235 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000341 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f800f28a-197d-4e69-b66d-3840065b674e-config-volume\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000372 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2kz\" (UniqueName: \"kubernetes.io/projected/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-kube-api-access-js2kz\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000421 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2x75\" (UniqueName: \"kubernetes.io/projected/f800f28a-197d-4e69-b66d-3840065b674e-kube-api-access-m2x75\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000452 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000470 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpr68\" (UniqueName: \"kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000486 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081b2ab6-1edc-4ec5-9054-367d9f5951af-cert\") pod \"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000504 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f800f28a-197d-4e69-b66d-3840065b674e-metrics-tls\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000519 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000540 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40c792e-b179-402f-82d0-9744288c0680-proxy-tls\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000555 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-key\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000571 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm66h\" (UniqueName: \"kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000586 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-mountpoint-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000620 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54llr\" (UniqueName: \"kubernetes.io/projected/3e92c71e-1bcc-455f-a270-1dd051662af6-kube-api-access-54llr\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000638 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-serving-cert\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hndvd\" (UniqueName: \"kubernetes.io/projected/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-kube-api-access-hndvd\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000673 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43747056-18ae-4153-9d16-9d4f4330ddb3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000691 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-metrics-tls\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000707 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzkp\" (UniqueName: \"kubernetes.io/projected/4d293772-95ef-4025-ab46-0b150e902a0f-kube-api-access-9qzkp\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000733 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000747 4693 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-config\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc67k\" (UniqueName: \"kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000789 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstcd\" (UniqueName: \"kubernetes.io/projected/e40c792e-b179-402f-82d0-9744288c0680-kube-api-access-gstcd\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000809 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-node-bootstrap-token\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000826 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5kb\" (UniqueName: \"kubernetes.io/projected/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-kube-api-access-jp5kb\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000842 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-serving-cert\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000860 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000919 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000939 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfvdf\" (UniqueName: \"kubernetes.io/projected/081b2ab6-1edc-4ec5-9054-367d9f5951af-kube-api-access-pfvdf\") pod 
\"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-client\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.000998 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f800f28a-197d-4e69-b66d-3840065b674e-config-volume\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001014 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e92c71e-1bcc-455f-a270-1dd051662af6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001058 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001084 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-certs\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001116 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-csi-data-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-plugins-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001158 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcff\" (UniqueName: \"kubernetes.io/projected/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-kube-api-access-wlcff\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001184 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/43747056-18ae-4153-9d16-9d4f4330ddb3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001220 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001251 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-config\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001269 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-socket-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001283 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-service-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001301 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-cabundle\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001340 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qz9\" (UniqueName: \"kubernetes.io/projected/bccb2393-5218-4cd0-9b7e-c9d19eab391b-kube-api-access-l5qz9\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001365 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc 
kubenswrapper[4693]: I1204 09:43:14.001405 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40c792e-b179-402f-82d0-9744288c0680-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001422 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xrf\" (UniqueName: \"kubernetes.io/projected/7180535e-ac7a-4999-a926-3a6ffe02852c-kube-api-access-f8xrf\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001438 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-registration-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001463 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001494 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43747056-18ae-4153-9d16-9d4f4330ddb3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001507 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.001749 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.002080 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.502069089 +0000 UTC m=+40.399662842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.002704 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-plugins-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.003372 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43747056-18ae-4153-9d16-9d4f4330ddb3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.003487 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-config\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.003829 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.004836 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.006749 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-mountpoint-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.006796 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-csi-data-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.006828 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: 
\"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007312 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-key\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007412 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-registration-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-service-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007644 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bccb2393-5218-4cd0-9b7e-c9d19eab391b-socket-dir\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-signing-cabundle\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.007810 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-config\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.008186 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-ca\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.008282 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40c792e-b179-402f-82d0-9744288c0680-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.009072 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f800f28a-197d-4e69-b66d-3840065b674e-metrics-tls\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.010005 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.010505 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/081b2ab6-1edc-4ec5-9054-367d9f5951af-cert\") pod \"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.010543 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43747056-18ae-4153-9d16-9d4f4330ddb3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011061 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011247 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-etcd-client\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011288 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-node-bootstrap-token\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011409 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d293772-95ef-4025-ab46-0b150e902a0f-serving-cert\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011661 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7180535e-ac7a-4999-a926-3a6ffe02852c-certs\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.011973 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-serving-cert\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.063703 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2kz\" (UniqueName: \"kubernetes.io/projected/02e5f7a9-a93a-4e64-83ee-ab23c12ef67f-kube-api-access-js2kz\") pod \"service-ca-9c57cc56f-lxk9g\" (UID: \"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f\") " pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.080317 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc67k\" (UniqueName: \"kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k\") pod \"controller-manager-879f6c89f-l5tpz\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.099683 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2x75\" (UniqueName: \"kubernetes.io/projected/f800f28a-197d-4e69-b66d-3840065b674e-kube-api-access-m2x75\") pod \"dns-default-wjxqk\" (UID: \"f800f28a-197d-4e69-b66d-3840065b674e\") " pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.102162 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.102301 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.602272396 +0000 UTC m=+40.499866149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.102447 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.102940 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.602927844 +0000 UTC m=+40.500521597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.113144 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm66h\" (UniqueName: \"kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.126971 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.129947 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpr68\" (UniqueName: \"kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68\") pod \"cni-sysctl-allowlist-ds-74srj\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.144703 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kn9rb"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.146508 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.148758 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcff\" (UniqueName: \"kubernetes.io/projected/e190b12e-4fed-4d2d-9e8d-3a8b30c60175-kube-api-access-wlcff\") pod \"service-ca-operator-777779d784-rhtg6\" (UID: \"e190b12e-4fed-4d2d-9e8d-3a8b30c60175\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.174232 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.185211 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.185429 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.192092 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.198610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sg997" event={"ID":"57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e","Type":"ContainerStarted","Data":"6d5785ae52b05c77ac49eb1c207c51ce104c68779518f7dd067ca7dfd6ff78b4"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.199418 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" event={"ID":"f30a48a3-da96-4844-8a96-3478db0a7018","Type":"ContainerStarted","Data":"62c56777edd53edd274dbf313e9d488f84bbd731b28560d43df174881e286d5e"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.200213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" event={"ID":"46a329a4-a450-4e39-bcbe-c7dcba1e6939","Type":"ContainerStarted","Data":"003303c552ed7073d9f9129c1c734c41c4f267f6d5a41dc8e3107db92900e508"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.201124 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" event={"ID":"764c0924-2f3b-4341-9922-a22d2f3cf145","Type":"ContainerStarted","Data":"9cfef2b33f61e61ae0e8a757a9418b45119913a0a50464f3eea21c5888ae754a"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.201947 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" event={"ID":"68219a06-b58a-4d36-b851-32dd1e4a2ec5","Type":"ContainerStarted","Data":"8e21fa0f0d1ae5a6f363f2557d09422747dff2e715af252b826e3e78292a54a1"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.205307 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" event={"ID":"5c8b51b7-718d-43b5-9e18-58966747279f","Type":"ContainerStarted","Data":"520046b4ec6980f9cb08d44bb1ae7d5fbe88788d0ff147a04dfdc9a014ae4a14"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.205382 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.205559 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.705519185 +0000 UTC m=+40.603112938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.205673 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.206087 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.70606007 +0000 UTC m=+40.603653823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.208679 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-55r8b" event={"ID":"833b50c0-572f-4534-8a96-af514ff81953","Type":"ContainerStarted","Data":"729f4798051616c779eb981068f929e08d543120b292ff888a5b88c47c9a4a20"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.209738 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" event={"ID":"ca1189f1-0dee-459a-bb4f-dfda69f2eee1","Type":"ContainerStarted","Data":"8694a622de8902730303ba29b6a90b6154b3021bbd39fc4dbd7d3df83d229329"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.211753 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43747056-18ae-4153-9d16-9d4f4330ddb3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75b6q\" (UID: \"43747056-18ae-4153-9d16-9d4f4330ddb3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.213502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" event={"ID":"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8","Type":"ContainerStarted","Data":"920c056da4690868725104e69ebf00b8d2f8d8f207a06c8424f6a89ae9caa7d7"} Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.217646 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.228851 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xrf\" (UniqueName: \"kubernetes.io/projected/7180535e-ac7a-4999-a926-3a6ffe02852c-kube-api-access-f8xrf\") pod \"machine-config-server-94np6\" (UID: \"7180535e-ac7a-4999-a926-3a6ffe02852c\") " pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.272772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5kb\" (UniqueName: \"kubernetes.io/projected/e9b6a05d-0104-46ac-aa11-2e13f0369b1f-kube-api-access-jp5kb\") pod \"package-server-manager-789f6589d5-g72br\" (UID: \"e9b6a05d-0104-46ac-aa11-2e13f0369b1f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.307008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.307581 4693 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.807562232 +0000 UTC m=+40.705155985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.308244 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjrqj"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.327436 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfvdf\" (UniqueName: \"kubernetes.io/projected/081b2ab6-1edc-4ec5-9054-367d9f5951af-kube-api-access-pfvdf\") pod \"ingress-canary-z9fxv\" (UID: \"081b2ab6-1edc-4ec5-9054-367d9f5951af\") " pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.354993 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tzx57"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.367201 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.373354 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.383891 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzbkw"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.385785 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.403877 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.408597 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.408993 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:14.908974502 +0000 UTC m=+40.806568255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.433834 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.455604 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.455650 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.461153 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.494032 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.500492 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z9fxv" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.504099 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.506586 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zctpg"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.507447 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-94np6" Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.509124 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.509365 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:15.009322553 +0000 UTC m=+40.906916346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.513742 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc"] Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.610663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.611086 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:15.111065612 +0000 UTC m=+41.008659455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:14 crc kubenswrapper[4693]: I1204 09:43:14.711642 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:14 crc kubenswrapper[4693]: E1204 09:43:14.711895 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:15.211862485 +0000 UTC m=+41.109456278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.451539 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.451641 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:16 crc kubenswrapper[4693]: E1204 09:43:16.451826 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:16.951818351 +0000 UTC m=+42.849412104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.455736 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6954da61-bafb-4b35-aa61-0f120c34c747-metrics-certs\") pod \"network-metrics-daemon-kncc4\" (UID: \"6954da61-bafb-4b35-aa61-0f120c34c747\") " pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.552221 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:16 crc kubenswrapper[4693]: E1204 09:43:16.552499 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:17.05246849 +0000 UTC m=+42.950062253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.654534 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:16 crc kubenswrapper[4693]: E1204 09:43:16.654912 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:17.154895788 +0000 UTC m=+43.052489551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.677641 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kncc4" Dec 04 09:43:16 crc kubenswrapper[4693]: I1204 09:43:16.755987 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:16 crc kubenswrapper[4693]: E1204 09:43:16.756298 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:17.256262186 +0000 UTC m=+43.153855989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:17 crc kubenswrapper[4693]: I1204 09:43:17.874536 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:17 crc kubenswrapper[4693]: E1204 09:43:17.875020 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.37499707 +0000 UTC m=+44.272590863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:17 crc kubenswrapper[4693]: I1204 09:43:17.976236 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:17 crc kubenswrapper[4693]: E1204 09:43:17.976538 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.476501812 +0000 UTC m=+44.374095605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:17 crc kubenswrapper[4693]: I1204 09:43:17.977058 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:17 crc kubenswrapper[4693]: E1204 09:43:17.977634 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.477609132 +0000 UTC m=+44.375202925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.078431 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.078810 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.578766655 +0000 UTC m=+44.476360438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.179759 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.180367 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.680310288 +0000 UTC m=+44.577904071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.281509 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.281755 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.781719938 +0000 UTC m=+44.679313691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.282049 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.282640 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.782628972 +0000 UTC m=+44.680222815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.310912 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-metrics-tls\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.311451 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.311457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hndvd\" (UniqueName: \"kubernetes.io/projected/955f4b1c-a9cc-42e0-bbbb-456dde994dcf-kube-api-access-hndvd\") pod \"dns-operator-744455d44c-6jrqg\" (UID: \"955f4b1c-a9cc-42e0-bbbb-456dde994dcf\") " pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.312507 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.312958 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.313167 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e92c71e-1bcc-455f-a270-1dd051662af6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.315459 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.315939 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.316280 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.316373 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzkp\" (UniqueName: \"kubernetes.io/projected/4d293772-95ef-4025-ab46-0b150e902a0f-kube-api-access-9qzkp\") pod \"etcd-operator-b45778765-zkq2l\" (UID: \"4d293772-95ef-4025-ab46-0b150e902a0f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.316859 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec65ec10-8b3d-4f00-8662-ee8ee7cbd533-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7jlpl\" (UID: \"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.317536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qz9\" (UniqueName: \"kubernetes.io/projected/bccb2393-5218-4cd0-9b7e-c9d19eab391b-kube-api-access-l5qz9\") pod \"csi-hostpathplugin-5vvnq\" (UID: \"bccb2393-5218-4cd0-9b7e-c9d19eab391b\") " pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.318018 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt5h7\" (UID: 
\"29687875-23eb-403d-a89f-eb4d32092d7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.318090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40c792e-b179-402f-82d0-9744288c0680-proxy-tls\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.318216 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.319052 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstcd\" (UniqueName: \"kubernetes.io/projected/e40c792e-b179-402f-82d0-9744288c0680-kube-api-access-gstcd\") pod \"machine-config-controller-84d6567774-gc6fl\" (UID: \"e40c792e-b179-402f-82d0-9744288c0680\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.321857 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54llr\" (UniqueName: \"kubernetes.io/projected/3e92c71e-1bcc-455f-a270-1dd051662af6-kube-api-access-54llr\") pod \"control-plane-machine-set-operator-78cbb6b69f-n9k5f\" (UID: \"3e92c71e-1bcc-455f-a270-1dd051662af6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.322688 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59sj2\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.340829 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.353868 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.369097 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.386276 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.386654 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:18.886634602 +0000 UTC m=+44.784228355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.428769 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" Dec 04 09:43:18 crc kubenswrapper[4693]: W1204 09:43:18.441929 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7180535e_ac7a_4999_a926_3a6ffe02852c.slice/crio-ff6fca3fef727693b28ab76d7c30550d4593a153d4c9c69264b131aedc06c1c7 WatchSource:0}: Error finding container ff6fca3fef727693b28ab76d7c30550d4593a153d4c9c69264b131aedc06c1c7: Status 404 returned error can't find the container with id ff6fca3fef727693b28ab76d7c30550d4593a153d4c9c69264b131aedc06c1c7 Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.488121 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.488120 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.489162 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:18.989149382 +0000 UTC m=+44.886743135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.589412 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.589565 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:19.089547524 +0000 UTC m=+44.987141277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.592475 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.592950 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.092933196 +0000 UTC m=+44.990526949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.611832 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.618527 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.643005 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q"] Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.693670 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.693829 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.193799651 +0000 UTC m=+45.091393424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.694048 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.694484 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.194463468 +0000 UTC m=+45.092057251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.795642 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.795840 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.295812897 +0000 UTC m=+45.193406680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.796040 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.796306 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.29629852 +0000 UTC m=+45.193892273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.897986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.898182 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.398142151 +0000 UTC m=+45.295735954 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.898462 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.899103 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.399079096 +0000 UTC m=+45.296672889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:18 crc kubenswrapper[4693]: I1204 09:43:18.999779 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:18 crc kubenswrapper[4693]: E1204 09:43:18.999893 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.499875029 +0000 UTC m=+45.397468782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.000254 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.000616 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.500603119 +0000 UTC m=+45.398196882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.085077 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.102081 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.102300 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.602274335 +0000 UTC m=+45.499868088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.102489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.102866 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.602848111 +0000 UTC m=+45.500441874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: W1204 09:43:19.124164 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43747056_18ae_4153_9d16_9d4f4330ddb3.slice/crio-bb94c12dde51ec9bcde18103f8e8a23dd5f9c3b51442fd915fc59d2c7aebd547 WatchSource:0}: Error finding container bb94c12dde51ec9bcde18103f8e8a23dd5f9c3b51442fd915fc59d2c7aebd547: Status 404 returned error can't find the container with id bb94c12dde51ec9bcde18103f8e8a23dd5f9c3b51442fd915fc59d2c7aebd547 Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.203245 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.205391 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.705371131 +0000 UTC m=+45.602964884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.235258 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" event={"ID":"a96a1772-5a62-4b57-bc83-4faa4dcb1260","Type":"ContainerStarted","Data":"37c75a8b313345ec8bf265cf469119d26bd6b845c874cca0e71db1381d68e24b"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.236111 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-94np6" event={"ID":"7180535e-ac7a-4999-a926-3a6ffe02852c","Type":"ContainerStarted","Data":"ff6fca3fef727693b28ab76d7c30550d4593a153d4c9c69264b131aedc06c1c7"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.237502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" event={"ID":"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506","Type":"ContainerStarted","Data":"4aae577e05c90335193a5da6774a9f4a56346cfb4773ebe357d5de34baf0fd5c"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.238159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" event={"ID":"39c1db76-c273-46ee-a00e-3dae4dc1ed6b","Type":"ContainerStarted","Data":"1099f91b73abdb29d556be88b2de2b28f5bc4862eea47cfc905629e2d8b30167"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.238721 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" event={"ID":"a0d11105-8b91-4aad-8c32-dfe2ef976028","Type":"ContainerStarted","Data":"d7385c08a959df6c132ec94795f3d4d8e7bf14e88b515b629233b89a0673bf26"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.239366 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" event={"ID":"36782e8d-b271-46a5-8f96-8979022991f2","Type":"ContainerStarted","Data":"05c72c11de2e122a295b7ae1b6ebc604d5d981e53181835892c218179020e66a"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.239925 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" event={"ID":"57e20f16-dfe1-45b4-8b13-860011aac931","Type":"ContainerStarted","Data":"eb2668c41341825932003bffb9e4f8e91d2bf185d7efce1af115435675ced9f2"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.240676 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" event={"ID":"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba","Type":"ContainerStarted","Data":"8672d7144338879bbab9a7fdc94ea722826a5038474de95031fceca447575326"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.241725 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" 
event={"ID":"c52cdcb0-2915-493e-ab48-7c863e590ee2","Type":"ContainerStarted","Data":"3508c9b2bc32b98d8c7628f613fc2d90e8ffc8c79a1e32e3eb1a65a4daffada9"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.249768 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" event={"ID":"d5fc4cb9-89af-47dd-b38a-2378c774a9c5","Type":"ContainerStarted","Data":"6584f5400338249934e9c49c476fffb64adc44e94b1d2d1acf2c398fda953297"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.251571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mhzjn" event={"ID":"18edbe10-dd1c-47a9-b8de-5f2d53306f2e","Type":"ContainerStarted","Data":"dee2200753a46f9a527a6d5d716f2e045742a754badd582ffa13e62bae80f73e"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.252743 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" event={"ID":"debdfef3-4184-4a37-a818-e5c41c81e2fd","Type":"ContainerStarted","Data":"e0461b1c9c562b498cb9888e0e6117980dbceb98aa4b9255ab8888c5a5c7b658"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.254046 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" event={"ID":"a85a548f-a204-4573-ae20-c62a17e17df3","Type":"ContainerStarted","Data":"740fcfb126b55226432be7e4ed5536cad2accfd658b1a7b07555eab770208ab2"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.257086 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z9fxv"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.260758 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" event={"ID":"1c46aadf-ba22-4bdc-b76a-8b9ad8880368","Type":"ContainerStarted","Data":"d6a392e26b00e4d771815957ef85e42514b5d63524867b1a2d3cd986398d8edb"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.261871 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" event={"ID":"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb","Type":"ContainerStarted","Data":"1e9a31960d6a8a8e9cc4c77f9ef4b5e9c8d9b5779a8df657f66896450c95dfc6"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.262894 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" event={"ID":"b3246be0-0c88-49c2-8cee-05c3661a509e","Type":"ContainerStarted","Data":"e3fced02bfba00ab53ce28a0cbeb5782177fd9afa52d1cda3e41deafdec0052b"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.263874 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" event={"ID":"43747056-18ae-4153-9d16-9d4f4330ddb3","Type":"ContainerStarted","Data":"bb94c12dde51ec9bcde18103f8e8a23dd5f9c3b51442fd915fc59d2c7aebd547"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.265013 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" event={"ID":"78528eec-0bba-40f0-9739-0a4e951d53da","Type":"ContainerStarted","Data":"ec235e8afb84d7a8826526a52a333c1d30d623501addbdc13b002c5d4f779217"} Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.280788 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-kncc4"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.306797 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.307163 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.80714469 +0000 UTC m=+45.704738533 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.308631 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.344630 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6jrqg"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.408253 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.408607 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:19.90858271 +0000 UTC m=+45.806176463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.510420 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.510783 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.010768421 +0000 UTC m=+45.908362174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.615807 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.632271 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.13220814 +0000 UTC m=+46.029801903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.719671 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.720001 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.219989023 +0000 UTC m=+46.117582776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.822111 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.822616 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.322595905 +0000 UTC m=+46.220189658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.904320 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lxk9g"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.906977 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zkq2l"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.909131 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.924442 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:19 crc kubenswrapper[4693]: E1204 09:43:19.925044 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.425029302 +0000 UTC m=+46.322623065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.964438 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wjxqk"] Dec 04 09:43:19 crc kubenswrapper[4693]: I1204 09:43:19.969045 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl"] Dec 04 09:43:20 crc kubenswrapper[4693]: W1204 09:43:20.020893 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d293772_95ef_4025_ab46_0b150e902a0f.slice/crio-309b0d7138f61a8a9b8810782133e31519127f6b3938b215bfdcb4747f5b4b8e WatchSource:0}: Error finding container 309b0d7138f61a8a9b8810782133e31519127f6b3938b215bfdcb4747f5b4b8e: Status 404 returned error can't find the container with id 309b0d7138f61a8a9b8810782133e31519127f6b3938b215bfdcb4747f5b4b8e Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.025021 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.025148 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.525128107 +0000 UTC m=+46.422721860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.025249 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.025565 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.525554468 +0000 UTC m=+46.423148221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: W1204 09:43:20.030844 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b6a05d_0104_46ac_aa11_2e13f0369b1f.slice/crio-ef67db79170770cd482e2e0e5cb5e4b050089cad13dfab39fc8a4cd4cb1f772f WatchSource:0}: Error finding container ef67db79170770cd482e2e0e5cb5e4b050089cad13dfab39fc8a4cd4cb1f772f: Status 404 returned error can't find the container with id ef67db79170770cd482e2e0e5cb5e4b050089cad13dfab39fc8a4cd4cb1f772f Dec 04 09:43:20 crc kubenswrapper[4693]: W1204 09:43:20.032234 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf800f28a_197d_4e69_b66d_3840065b674e.slice/crio-4c64d2441d4780ec629bb19fcd6dd761b139596812e1a1bfbb5dd48ab3b4d8ca WatchSource:0}: Error finding container 4c64d2441d4780ec629bb19fcd6dd761b139596812e1a1bfbb5dd48ab3b4d8ca: Status 404 returned error can't find the container with id 4c64d2441d4780ec629bb19fcd6dd761b139596812e1a1bfbb5dd48ab3b4d8ca Dec 04 09:43:20 crc kubenswrapper[4693]: W1204 09:43:20.037784 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40c792e_b179_402f_82d0_9744288c0680.slice/crio-ba3d38b74a1024acf81b2dc164381c6399b18e5ca075735d1b4bd61be910b9b3 WatchSource:0}: Error finding container ba3d38b74a1024acf81b2dc164381c6399b18e5ca075735d1b4bd61be910b9b3: Status 404 returned error can't find the container with id ba3d38b74a1024acf81b2dc164381c6399b18e5ca075735d1b4bd61be910b9b3 Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.125955 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.126086 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.626065744 +0000 UTC m=+46.523659497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.126265 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.126558 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.626550957 +0000 UTC m=+46.524144710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.226993 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.227198 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.727169495 +0000 UTC m=+46.624763248 (durationBeforeRetry 500ms). 
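[editor's note] Each failed volume operation is parked by nestedpendingoperations with durationBeforeRetry 500ms, and the "No retries permitted until" deadline is about 500ms after the timestamp on the same entry; the reconciler re-queues the volume roughly every 100ms, which is why the same mount/unmount error pair repeats about twice per second in this window. A quick check of that arithmetic using two timestamps copied from an entry above (nanosecond digits truncated to microseconds for strptime):

```python
# Verify the retry backoff from the log: klog failure time vs. the
# "No retries permitted until" deadline in the same entry (see 09:43:19.510783).
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f %z"

failed_at   = datetime.strptime("2025-12-04 09:43:19.510783 +0000", FMT)
retry_after = datetime.strptime("2025-12-04 09:43:20.010768 +0000", FMT)

print((retry_after - failed_at).total_seconds())   # ~0.5 s, i.e. 500ms backoff
```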
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.227251 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.227779 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.727761761 +0000 UTC m=+46.625355514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.280937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-55r8b" event={"ID":"833b50c0-572f-4534-8a96-af514ff81953","Type":"ContainerStarted","Data":"13270df9181c6674255bb54239b5ca7bc7164f71c3a39987631b70fe58a31da3"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.283502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" event={"ID":"f30a48a3-da96-4844-8a96-3478db0a7018","Type":"ContainerStarted","Data":"38c0d57879661eed427d8427c2e3f0295dd7fb83564dd6ca629d3297e44a7e01"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.284557 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" event={"ID":"e40c792e-b179-402f-82d0-9744288c0680","Type":"ContainerStarted","Data":"ba3d38b74a1024acf81b2dc164381c6399b18e5ca075735d1b4bd61be910b9b3"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.285378 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kncc4" event={"ID":"6954da61-bafb-4b35-aa61-0f120c34c747","Type":"ContainerStarted","Data":"b4ea4589cb221e8a4210c89f78312fa0296055725006691e1ead1b2b2fb0fffe"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.286935 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" event={"ID":"4d293772-95ef-4025-ab46-0b150e902a0f","Type":"ContainerStarted","Data":"309b0d7138f61a8a9b8810782133e31519127f6b3938b215bfdcb4747f5b4b8e"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.288158 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" event={"ID":"d90f3655-18d4-4dec-b9d4-7309fa424c4e","Type":"ContainerStarted","Data":"35f4c03094d794f3716e26019b15ea8cc5e156f196aebe8617bbf2f86eb1e06d"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.289954 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" event={"ID":"e9b6a05d-0104-46ac-aa11-2e13f0369b1f","Type":"ContainerStarted","Data":"ef67db79170770cd482e2e0e5cb5e4b050089cad13dfab39fc8a4cd4cb1f772f"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.303671 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" event={"ID":"68219a06-b58a-4d36-b851-32dd1e4a2ec5","Type":"ContainerStarted","Data":"69183b0b7e2df556a610b3cf896e4abfe7e8bfed4dcbf458912594c4f9fdb1c1"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.304621 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z9fxv" event={"ID":"081b2ab6-1edc-4ec5-9054-367d9f5951af","Type":"ContainerStarted","Data":"87d677f27fef8c772551810d62ee755348f1a465558fd0ef2a6f13ca65157533"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.305333 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wjxqk" event={"ID":"f800f28a-197d-4e69-b66d-3840065b674e","Type":"ContainerStarted","Data":"4c64d2441d4780ec629bb19fcd6dd761b139596812e1a1bfbb5dd48ab3b4d8ca"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.307683 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" event={"ID":"e190b12e-4fed-4d2d-9e8d-3a8b30c60175","Type":"ContainerStarted","Data":"d7dc1cf8beeddf32d20a44b64a76283cebb800d2d517fc1bf1a8cb7936d8a384"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.309281 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" event={"ID":"955f4b1c-a9cc-42e0-bbbb-456dde994dcf","Type":"ContainerStarted","Data":"24c959f2e97a1364736fc5bf4940f0b6fe07833eedfb7be32e3d0dfa8f7e7bdf"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.310166 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" event={"ID":"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f","Type":"ContainerStarted","Data":"1ca531bd20e1f2becc1c36fd1130fe6fb6f400eb422536ec2b63b16bb335124e"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.310993 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" event={"ID":"390724a0-ca5c-4309-93a5-13aa44b32831","Type":"ContainerStarted","Data":"fd66efc75ac9179220a8380e1f18f7a670e031731e81f5b56b2a351019ad53ec"} Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.328552 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.328982 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-04 09:43:20.828964476 +0000 UTC m=+46.726558229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.429698 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.430099 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:20.930085928 +0000 UTC m=+46.827679681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.530881 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.531184 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.031160058 +0000 UTC m=+46.928753811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.559990 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.594595 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f"] Dec 04 09:43:20 crc kubenswrapper[4693]: W1204 09:43:20.614450 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29687875_23eb_403d_a89f_eb4d32092d7e.slice/crio-d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0 WatchSource:0}: Error finding container d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0: Status 404 returned error can't find the container with id d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0 Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.618569 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl"] Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.622105 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5vvnq"] Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.632281 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.632629 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.132614469 +0000 UTC m=+47.030208222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.734385 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.735032 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.235014066 +0000 UTC m=+47.132607819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.836241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.836589 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.336577419 +0000 UTC m=+47.234171172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:20 crc kubenswrapper[4693]: I1204 09:43:20.937688 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:20 crc kubenswrapper[4693]: E1204 09:43:20.938317 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.438294768 +0000 UTC m=+47.335888521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.039310 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.039994 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.539982024 +0000 UTC m=+47.437575767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.140849 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.141072 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.641039325 +0000 UTC m=+47.538633078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.141327 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.141697 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.641685082 +0000 UTC m=+47.539278835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.242397 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.242556 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.742530746 +0000 UTC m=+47.640124489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.242697 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.243040 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.74303308 +0000 UTC m=+47.640626833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.321928 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" event={"ID":"e40c792e-b179-402f-82d0-9744288c0680","Type":"ContainerStarted","Data":"534aa238a093476deb90168856c30ed99d3c3749cfc1ee511b8091881fb62d60"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.322973 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" event={"ID":"6a14b590-276f-49be-961a-c459c975c8ab","Type":"ContainerStarted","Data":"e3e364304839c6b6e0afaecc8f5d6affd0f50883bafeefdf5f568b025d718177"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.326799 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" event={"ID":"78528eec-0bba-40f0-9739-0a4e951d53da","Type":"ContainerStarted","Data":"050fe71662b474a007e73d4142b19229d33303d522833f3a9048f4a63f393e0a"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.337808 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sg997" event={"ID":"57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e","Type":"ContainerStarted","Data":"464d8195d5aeb018c88a4bb19ead60a23415928f53a9703a871303d3c34248a4"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.338230 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.341176 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.341232 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.343475 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.343684 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.843662639 +0000 UTC m=+47.741256392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.343890 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.344173 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.844166243 +0000 UTC m=+47.741759996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.344269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" event={"ID":"d90f3655-18d4-4dec-b9d4-7309fa424c4e","Type":"ContainerStarted","Data":"0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.351739 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" event={"ID":"a96a1772-5a62-4b57-bc83-4faa4dcb1260","Type":"ContainerStarted","Data":"80fd6d97ddcc3f3502b774473ddd72e9e17b39c304b6687ba4fbbc7fb9035870"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.358020 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" event={"ID":"e190b12e-4fed-4d2d-9e8d-3a8b30c60175","Type":"ContainerStarted","Data":"266658c0d6a8484b332b9bf96e62ff6038dcd65483765b8d1c35f784bda73ab6"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.360228 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kncc4" event={"ID":"6954da61-bafb-4b35-aa61-0f120c34c747","Type":"ContainerStarted","Data":"ebee6660fcb316cec27045dcbb151fb83cb885345a22537ab87f4b439bb2fbbb"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.367528 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vflmh" podStartSLOduration=25.367511831 podStartE2EDuration="25.367511831s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
09:43:21.366986237 +0000 UTC m=+47.264579990" watchObservedRunningTime="2025-12-04 09:43:21.367511831 +0000 UTC m=+47.265105584" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.393725 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sg997" podStartSLOduration=25.393686315 podStartE2EDuration="25.393686315s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.390529871 +0000 UTC m=+47.288123614" watchObservedRunningTime="2025-12-04 09:43:21.393686315 +0000 UTC m=+47.291280068" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.406571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" event={"ID":"e9b6a05d-0104-46ac-aa11-2e13f0369b1f","Type":"ContainerStarted","Data":"4a9cc1fb3dc200b3b9df80292c232b2a96302ffac3623718a1dfd7b7ef62edba"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.414010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" event={"ID":"a0d11105-8b91-4aad-8c32-dfe2ef976028","Type":"ContainerStarted","Data":"aa7d543f15797b11c35d1a328eea7681f8885be850b773a530d0276923f2a901"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.415215 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.425479 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pstsq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.426025 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podUID="a0d11105-8b91-4aad-8c32-dfe2ef976028" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.444918 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.446058 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:21.946043864 +0000 UTC m=+47.843637617 (durationBeforeRetry 500ms). 
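[editor's note] The readiness-probe failures above ("connect: connection refused" against 10.217.0.12:8080, 10.217.0.34:8443, and similar) are the kubelet probing containers that have just started and have not bound their ports yet; connection errors count as probe failures and are retried on the probe period. A rough stand-in for such an HTTP GET probe, simplified from the real kubelet behavior (Kubernetes treats any status from 200 to 399 as success); the address is taken from the log and is only reachable from inside the cluster/node:

```python
# Rough stand-in for an HTTP readiness probe: GET the endpoint, treat any
# connection error or non-2xx/3xx status as "not ready".
import urllib.error
import urllib.request

def http_probe(url: str, timeout: float = 1.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        # e.g. "connection refused" while the container is still starting up
        return False

print(http_probe("http://10.217.0.12:8080/"))   # pod IP/port from the log
```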
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.460586 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z9fxv" event={"ID":"081b2ab6-1edc-4ec5-9054-367d9f5951af","Type":"ContainerStarted","Data":"adb0d5146f4e6f25c7e3c4790c2f4a2ba3b9863d527178dce2936d4ac716af26"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.466004 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" event={"ID":"bccb2393-5218-4cd0-9b7e-c9d19eab391b","Type":"ContainerStarted","Data":"d0274522b8a646ebceef146292550cc3812ed4dbd06f25abd2d3a010a81a6396"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.472382 4693 generic.go:334] "Generic (PLEG): container finished" podID="1c46aadf-ba22-4bdc-b76a-8b9ad8880368" containerID="846aa6fae2321397c140d4451f3d10328b0e8307436626d71cf5e16235a054df" exitCode=0 Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.472447 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" event={"ID":"1c46aadf-ba22-4bdc-b76a-8b9ad8880368","Type":"ContainerDied","Data":"846aa6fae2321397c140d4451f3d10328b0e8307436626d71cf5e16235a054df"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.478085 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" event={"ID":"3e92c71e-1bcc-455f-a270-1dd051662af6","Type":"ContainerStarted","Data":"4de7cac2ce2d0b3ac5f5e0c34fbce4360b24a4f1f50e9c8d68074cd3db4005cd"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.480859 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" event={"ID":"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8","Type":"ContainerStarted","Data":"560104bda2fc0982bd34be9a2e569dac8e52c03db2fae0376654d879532494df"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.482204 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" event={"ID":"764c0924-2f3b-4341-9922-a22d2f3cf145","Type":"ContainerStarted","Data":"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.483262 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.494347 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" event={"ID":"02e5f7a9-a93a-4e64-83ee-ab23c12ef67f","Type":"ContainerStarted","Data":"b1b1b924e558cf487346818b6d11d151ca80426171b03e26d597f3aae7ab7d2f"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.512806 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zntf6" podStartSLOduration=25.512780231 podStartE2EDuration="25.512780231s" 
podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.45067536 +0000 UTC m=+47.348269113" watchObservedRunningTime="2025-12-04 09:43:21.512780231 +0000 UTC m=+47.410373994" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.526829 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mhzjn" event={"ID":"18edbe10-dd1c-47a9-b8de-5f2d53306f2e","Type":"ContainerStarted","Data":"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.539460 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podStartSLOduration=25.539441148999998 podStartE2EDuration="25.539441149s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.503652706 +0000 UTC m=+47.401246459" watchObservedRunningTime="2025-12-04 09:43:21.539441149 +0000 UTC m=+47.437034902" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.545252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" event={"ID":"5c8b51b7-718d-43b5-9e18-58966747279f","Type":"ContainerStarted","Data":"9d3c885640e6eb106fb87caae4546b591eaf4ef319575edd1fdc9b75afa9abda"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.546189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.548800 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.048640337 +0000 UTC m=+47.946234090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.553517 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-94np6" event={"ID":"7180535e-ac7a-4999-a926-3a6ffe02852c","Type":"ContainerStarted","Data":"ebbb11c8346bbb9037672e780f59f06c23941f8dca1246596357ebc41bc6b6bc"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.576020 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" event={"ID":"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533","Type":"ContainerStarted","Data":"5b0995f35b8fdcf5c5605a41c079ec0b560cbd6416a53faf2ceb3c508993b911"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.582436 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" podStartSLOduration=26.582403145 podStartE2EDuration="26.582403145s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.576011283 +0000 UTC m=+47.473605036" watchObservedRunningTime="2025-12-04 09:43:21.582403145 +0000 UTC m=+47.479996918" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.597187 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" event={"ID":"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba","Type":"ContainerStarted","Data":"d6797c8b21816b7ebb4657fd4b79e7c4304be49b52d1ec1a8031cec0ba4eb428"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.602270 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" event={"ID":"b3246be0-0c88-49c2-8cee-05c3661a509e","Type":"ContainerStarted","Data":"b6737dd0041219e6f9f045adb1480cbbfa9b4d1167b13cadd733bac9c4affeed"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.603167 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.604748 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjrqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.604794 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podUID="b3246be0-0c88-49c2-8cee-05c3661a509e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.610129 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" event={"ID":"46a329a4-a450-4e39-bcbe-c7dcba1e6939","Type":"ContainerStarted","Data":"824d0c30c90c031c04eab4a88ede5dc50d05d2483f2c21fd9d727028f82ca34d"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.614700 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-94np6" podStartSLOduration=10.614682444 podStartE2EDuration="10.614682444s" podCreationTimestamp="2025-12-04 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.613370019 +0000 UTC m=+47.510963772" watchObservedRunningTime="2025-12-04 09:43:21.614682444 +0000 UTC m=+47.512276197" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.619455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" event={"ID":"390724a0-ca5c-4309-93a5-13aa44b32831","Type":"ContainerStarted","Data":"138202350c7d907279449314b3ae20cf646daebd3b4d027e6c73a0dc2788a760"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.620409 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.621855 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" event={"ID":"36782e8d-b271-46a5-8f96-8979022991f2","Type":"ContainerStarted","Data":"dab9c1011ea07d88de65b3480f049b963e02c9aff2d843592df61f618cb2f585"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.624542 4693 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l5tpz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.624589 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.626450 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" event={"ID":"a85a548f-a204-4573-ae20-c62a17e17df3","Type":"ContainerStarted","Data":"be4ce9e7f44041e4d900bd09ef5b1a8b1642f614b4c4e6f863dc151bad2a6e15"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.626967 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.634412 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jdxk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.634490 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" 
podUID="a85a548f-a204-4573-ae20-c62a17e17df3" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.637403 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mhzjn" podStartSLOduration=25.637365995 podStartE2EDuration="25.637365995s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.636321716 +0000 UTC m=+47.533915469" watchObservedRunningTime="2025-12-04 09:43:21.637365995 +0000 UTC m=+47.534959778" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.643835 4693 generic.go:334] "Generic (PLEG): container finished" podID="c7e74be3-f8f4-4f94-8f11-657cb2c75ceb" containerID="d227aeb4a48c88c66df5238ca42c4c26c8870681d12e754e46704152d7824cf1" exitCode=0 Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.644770 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" event={"ID":"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb","Type":"ContainerDied","Data":"d227aeb4a48c88c66df5238ca42c4c26c8870681d12e754e46704152d7824cf1"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.650834 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.661707 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.161651638 +0000 UTC m=+48.059245381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.721958 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" event={"ID":"4d293772-95ef-4025-ab46-0b150e902a0f","Type":"ContainerStarted","Data":"9a874db219581f60c889fd7fa47569ea5390bb76d83bdb3037dc79612d4492c6"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.736133 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" podStartSLOduration=25.736113563 podStartE2EDuration="25.736113563s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.673157789 +0000 UTC m=+47.570751552" watchObservedRunningTime="2025-12-04 09:43:21.736113563 +0000 UTC m=+47.633707316" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.754315 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.751103 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" podStartSLOduration=25.751082835 podStartE2EDuration="25.751082835s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.749615526 +0000 UTC m=+47.647209269" watchObservedRunningTime="2025-12-04 09:43:21.751082835 +0000 UTC m=+47.648676588" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.776450 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" podStartSLOduration=25.776431207999998 podStartE2EDuration="25.776431208s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.772337087 +0000 UTC m=+47.669930840" watchObservedRunningTime="2025-12-04 09:43:21.776431208 +0000 UTC m=+47.674024961" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.779703 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" event={"ID":"57e20f16-dfe1-45b4-8b13-860011aac931","Type":"ContainerStarted","Data":"e578d42b9de018f28c62ac12f188f9682779dad346949abedf58aa4164dc59a8"} Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.782722 4693 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.282704107 +0000 UTC m=+48.180297860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.791702 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" event={"ID":"debdfef3-4184-4a37-a818-e5c41c81e2fd","Type":"ContainerStarted","Data":"b96bc1adda3220346da59cca21b14ce7b9c2c5eae2aa20f542bebd441eb396ee"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.811627 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" event={"ID":"29687875-23eb-403d-a89f-eb4d32092d7e","Type":"ContainerStarted","Data":"d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.812914 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podStartSLOduration=25.812899419 podStartE2EDuration="25.812899419s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:21.810737452 +0000 UTC m=+47.708331205" watchObservedRunningTime="2025-12-04 09:43:21.812899419 +0000 UTC m=+47.710493172" Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.846751 4693 generic.go:334] "Generic (PLEG): container finished" podID="ca1189f1-0dee-459a-bb4f-dfda69f2eee1" containerID="9dacd8482e243dbd53ccfdba59a86ff676cb21ccdffd81323623e2d528684209" exitCode=0 Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.846864 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" event={"ID":"ca1189f1-0dee-459a-bb4f-dfda69f2eee1","Type":"ContainerDied","Data":"9dacd8482e243dbd53ccfdba59a86ff676cb21ccdffd81323623e2d528684209"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.857555 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.858086 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.358068256 +0000 UTC m=+48.255662009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.918648 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" event={"ID":"955f4b1c-a9cc-42e0-bbbb-456dde994dcf","Type":"ContainerStarted","Data":"851093f5513394209d00c29d6535a20a75646f5670f3037865ad387a81b58e1b"} Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.966469 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:21 crc kubenswrapper[4693]: E1204 09:43:21.967900 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.467872091 +0000 UTC m=+48.365465844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:21 crc kubenswrapper[4693]: I1204 09:43:21.999515 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" event={"ID":"1a2d7ab3-ae2e-4c16-a1c9-97997ded9506","Type":"ContainerStarted","Data":"5563038e1ec6bc6f0bef63b2e3d9f1b8e903f830fd601d8ea1dfb0a6a7b85160"} Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.000958 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.006411 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" event={"ID":"39c1db76-c273-46ee-a00e-3dae4dc1ed6b","Type":"ContainerStarted","Data":"b00fe003e47302407de052a9f2317ea7da2b70db5c0070c8a94d3863a5d3d2fd"} Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.019180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" event={"ID":"43747056-18ae-4153-9d16-9d4f4330ddb3","Type":"ContainerStarted","Data":"d852432fb87b7009d9ee6f8f0bb70037128c4f8a2e6837327c66ec49a7920525"} Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.022380 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" podStartSLOduration=26.022366568 
podStartE2EDuration="26.022366568s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.020837117 +0000 UTC m=+47.918430870" watchObservedRunningTime="2025-12-04 09:43:22.022366568 +0000 UTC m=+47.919960321" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.043654 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-65lkg" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.050839 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75b6q" podStartSLOduration=26.050820224 podStartE2EDuration="26.050820224s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.050245779 +0000 UTC m=+47.947839532" watchObservedRunningTime="2025-12-04 09:43:22.050820224 +0000 UTC m=+47.948413977" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.063731 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" event={"ID":"c52cdcb0-2915-493e-ab48-7c863e590ee2","Type":"ContainerStarted","Data":"b4db989df98d6d377c7c99614cdc02ef9f2de0e4efe904380512f217ec752ba8"} Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.067457 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.068915 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.56889438 +0000 UTC m=+48.466488123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.094094 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" event={"ID":"d5fc4cb9-89af-47dd-b38a-2378c774a9c5","Type":"ContainerStarted","Data":"25a01a1fb0a28c0e26f90aedbe8c5f18d00ebb95aa4c165fbdd0ae13bd3d1e41"} Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.095506 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.102750 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.121905 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfjld" podStartSLOduration=26.121882307 podStartE2EDuration="26.121882307s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.086122315 +0000 UTC m=+47.983716068" watchObservedRunningTime="2025-12-04 09:43:22.121882307 +0000 UTC m=+48.019476060" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.123047 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mq9k6" podStartSLOduration=26.123040868 podStartE2EDuration="26.123040868s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.11384147 +0000 UTC m=+48.011435223" watchObservedRunningTime="2025-12-04 09:43:22.123040868 +0000 UTC m=+48.020634621" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.169740 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.173774 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.673761803 +0000 UTC m=+48.571355556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.184148 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" podStartSLOduration=25.184134833 podStartE2EDuration="25.184134833s" podCreationTimestamp="2025-12-04 09:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.182109018 +0000 UTC m=+48.079702771" watchObservedRunningTime="2025-12-04 09:43:22.184134833 +0000 UTC m=+48.081728586" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.262692 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-55r8b" podStartSLOduration=26.262674367 podStartE2EDuration="26.262674367s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.262617726 +0000 UTC m=+48.160211489" watchObservedRunningTime="2025-12-04 09:43:22.262674367 +0000 UTC m=+48.160268110" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.262961 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9x82p" podStartSLOduration=27.262956544 podStartE2EDuration="27.262956544s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.221039386 +0000 UTC m=+48.118633139" watchObservedRunningTime="2025-12-04 09:43:22.262956544 +0000 UTC m=+48.160550297" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.271803 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.272032 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.771999687 +0000 UTC m=+48.669593460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.272189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.272624 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.772614584 +0000 UTC m=+48.670208337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.302375 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kn9rb" podStartSLOduration=27.302358505 podStartE2EDuration="27.302358505s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:22.297000811 +0000 UTC m=+48.194594564" watchObservedRunningTime="2025-12-04 09:43:22.302358505 +0000 UTC m=+48.199952258" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.373008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.373400 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.873377477 +0000 UTC m=+48.770971220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.475011 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.475890 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:22.975876986 +0000 UTC m=+48.873470739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.483088 4693 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jznlz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.483150 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.499156 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.523545 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:22 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:22 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:22 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.523603 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:22 crc 
kubenswrapper[4693]: I1204 09:43:22.576661 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.577052 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.077037139 +0000 UTC m=+48.974630892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.678374 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.678656 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.178643394 +0000 UTC m=+49.076237147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.778850 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.779169 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.279128309 +0000 UTC m=+49.176722102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.879918 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.880391 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.380370724 +0000 UTC m=+49.277964487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:22 crc kubenswrapper[4693]: I1204 09:43:22.980974 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:22 crc kubenswrapper[4693]: E1204 09:43:22.981470 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.481452685 +0000 UTC m=+49.379046448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.082202 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.082587 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.582573536 +0000 UTC m=+49.480167289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.100212 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" event={"ID":"8d1cb4d7-e1d9-4755-bee4-c571f8cffcba","Type":"ContainerStarted","Data":"0af3830c061af0c6913d7cf9b7d56356090693b56d167d8a93e09570523e92ec"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.102368 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" event={"ID":"29687875-23eb-403d-a89f-eb4d32092d7e","Type":"ContainerStarted","Data":"5ea074f2e9de7abe08b01bac6fcba3e76c3c10e1c958067b46d7e6459f8f9d25"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.103627 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" event={"ID":"46a329a4-a450-4e39-bcbe-c7dcba1e6939","Type":"ContainerStarted","Data":"51fed99080e26c8a74bf1e304c343c81b8b4b2a986b78475fa83ca64c20f2fce"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.105219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" event={"ID":"1c46aadf-ba22-4bdc-b76a-8b9ad8880368","Type":"ContainerStarted","Data":"9145fa0f9da868245785c1a084364c8ba37014ea2027ded742c945d3553dfec5"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.106305 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" event={"ID":"e59e3e31-dd03-4e17-b1ee-5d8fc57b05a8","Type":"ContainerStarted","Data":"de0dbb3da40cf593f841abef22b8ba70ce39e6e6f08052d599944e3ffec55779"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.107785 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" event={"ID":"e9b6a05d-0104-46ac-aa11-2e13f0369b1f","Type":"ContainerStarted","Data":"ee1872a49dfb201eba349cbc73f1481d3944f60974290c853c704e3d9ab8f2d1"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.108311 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.109420 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" event={"ID":"ec65ec10-8b3d-4f00-8662-ee8ee7cbd533","Type":"ContainerStarted","Data":"c3009863964e1c6069cd14cde013f9f0f8e24751917950373961d94f9af569c8"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.112247 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" event={"ID":"3e92c71e-1bcc-455f-a270-1dd051662af6","Type":"ContainerStarted","Data":"b628d0d9415cc60be182178fad21932e158ba1e21f1222726494d869e4708a91"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.116423 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" event={"ID":"6a14b590-276f-49be-961a-c459c975c8ab","Type":"ContainerStarted","Data":"c1988ac47aa78707c3715ab98c842ff3cfc5e638162e121fd8d2f691bea57170"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.118517 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wjxqk" event={"ID":"f800f28a-197d-4e69-b66d-3840065b674e","Type":"ContainerStarted","Data":"a9bf3734dca11453008137d002dbd12ff1a071cb032f66cfdbb779bf0b34f0e5"} Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127592 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pstsq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127621 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127651 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podUID="a0d11105-8b91-4aad-8c32-dfe2ef976028" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127692 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127737 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjrqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.127835 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podUID="b3246be0-0c88-49c2-8cee-05c3661a509e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.128225 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.138405 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qn6gc" podStartSLOduration=27.138386299 podStartE2EDuration="27.138386299s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.116732597 +0000 UTC m=+49.014326380" watchObservedRunningTime="2025-12-04 09:43:23.138386299 +0000 UTC m=+49.035980062" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.139979 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n9k5f" podStartSLOduration=27.139966861 podStartE2EDuration="27.139966861s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.137383342 +0000 UTC m=+49.034977095" watchObservedRunningTime="2025-12-04 09:43:23.139966861 +0000 UTC m=+49.037560614" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.163573 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" podStartSLOduration=27.163559277 podStartE2EDuration="27.163559277s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.161894342 +0000 UTC m=+49.059488115" watchObservedRunningTime="2025-12-04 09:43:23.163559277 +0000 UTC m=+49.061153030" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.165764 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.177101 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wkns5" podStartSLOduration=27.177079011 podStartE2EDuration="27.177079011s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.176703131 +0000 UTC m=+49.074296884" watchObservedRunningTime="2025-12-04 09:43:23.177079011 +0000 UTC m=+49.074672784" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.183055 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.183297 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.683268297 +0000 UTC m=+49.580862080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.184456 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.189224 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.689202247 +0000 UTC m=+49.586796000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.210662 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podStartSLOduration=12.210640304 podStartE2EDuration="12.210640304s" podCreationTimestamp="2025-12-04 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.207951602 +0000 UTC m=+49.105545355" watchObservedRunningTime="2025-12-04 09:43:23.210640304 +0000 UTC m=+49.108234067" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.225843 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z9fxv" podStartSLOduration=12.225826613 podStartE2EDuration="12.225826613s" podCreationTimestamp="2025-12-04 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.224196439 +0000 UTC m=+49.121790192" watchObservedRunningTime="2025-12-04 09:43:23.225826613 +0000 UTC m=+49.123420356" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.274622 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zkq2l" podStartSLOduration=27.274607926 podStartE2EDuration="27.274607926s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.258260476 +0000 UTC m=+49.155854229" watchObservedRunningTime="2025-12-04 09:43:23.274607926 +0000 UTC m=+49.172201679" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.285313 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.285574 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.78555026 +0000 UTC m=+49.683144003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.285652 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.285680 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.285839 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.286215 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.286294 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.78628688 +0000 UTC m=+49.683880633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.286350 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.297618 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rhtg6" podStartSLOduration=26.297600555 podStartE2EDuration="26.297600555s" podCreationTimestamp="2025-12-04 09:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.276966869 +0000 UTC m=+49.174560622" watchObservedRunningTime="2025-12-04 09:43:23.297600555 +0000 UTC m=+49.195194308" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.387214 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.387685 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.887659419 +0000 UTC m=+49.785253172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.417538 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pstsq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.417625 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podUID="a0d11105-8b91-4aad-8c32-dfe2ef976028" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.417712 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pstsq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.417740 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podUID="a0d11105-8b91-4aad-8c32-dfe2ef976028" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.464836 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjrqj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.464911 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podUID="b3246be0-0c88-49c2-8cee-05c3661a509e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.464937 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjrqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.465410 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podUID="b3246be0-0c88-49c2-8cee-05c3661a509e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.489507 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.489921 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:23.989904681 +0000 UTC m=+49.887498434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.499148 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.501756 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:23 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:23 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:23 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.501989 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.590736 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.590910 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.090886059 +0000 UTC m=+49.988479812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.612586 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.612646 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.614675 4693 patch_prober.go:28] interesting pod/console-f9d7485db-mhzjn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.614746 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mhzjn" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.692575 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.692915 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.192899105 +0000 UTC m=+50.090492858 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.794136 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.794322 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:24.294295645 +0000 UTC m=+50.191889398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.794483 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.794920 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.294908251 +0000 UTC m=+50.192501994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.895234 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.895435 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.395407937 +0000 UTC m=+50.293001690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.895519 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.895840 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.395833058 +0000 UTC m=+50.293426811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.938385 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lxk9g" podStartSLOduration=26.938366502 podStartE2EDuration="26.938366502s" podCreationTimestamp="2025-12-04 09:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:23.297309617 +0000 UTC m=+49.194903370" watchObservedRunningTime="2025-12-04 09:43:23.938366502 +0000 UTC m=+49.835960255" Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.941396 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-74srj"] Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.996179 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.996365 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.496323763 +0000 UTC m=+50.393917516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:23 crc kubenswrapper[4693]: I1204 09:43:23.996556 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:23 crc kubenswrapper[4693]: E1204 09:43:23.996902 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.496893488 +0000 UTC m=+50.394487241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.097760 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.097924 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.597899627 +0000 UTC m=+50.495493380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.098001 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.098312 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.598304958 +0000 UTC m=+50.495898711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.119350 4693 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jznlz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.119384 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.121659 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jdxk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.121717 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" podUID="a85a548f-a204-4573-ae20-c62a17e17df3" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122054 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pstsq container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122096 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" podUID="a0d11105-8b91-4aad-8c32-dfe2ef976028" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122296 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122324 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122547 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjrqj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.122580 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" podUID="b3246be0-0c88-49c2-8cee-05c3661a509e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.125976 4693 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l5tpz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.126028 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.198583 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.198715 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.69868434 +0000 UTC m=+50.596278093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.200062 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.200423 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.700409627 +0000 UTC m=+50.598003380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.301905 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.302113 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.802086323 +0000 UTC m=+50.699680076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.302235 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.302529 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.802517606 +0000 UTC m=+50.700111359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.403574 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.403789 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.903762391 +0000 UTC m=+50.801356134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.404050 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.404353 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:24.904324635 +0000 UTC m=+50.801918388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.476245 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.502242 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:24 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:24 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:24 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.502338 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.504785 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.505006 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.004980895 +0000 UTC m=+50.902574648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.505262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.505608 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.005594902 +0000 UTC m=+50.903188655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.607069 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.607264 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.107237277 +0000 UTC m=+51.004831030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.607600 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.607909 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.107901366 +0000 UTC m=+51.005495119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.703270 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jdxk container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.703328 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" podUID="a85a548f-a204-4573-ae20-c62a17e17df3" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.708074 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.708237 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.208186775 +0000 UTC m=+51.105780528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.708430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.708778 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.208770221 +0000 UTC m=+51.106363974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.809248 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.809757 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.309623766 +0000 UTC m=+51.207217549 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:24 crc kubenswrapper[4693]: I1204 09:43:24.911136 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:24 crc kubenswrapper[4693]: E1204 09:43:24.911567 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.411546099 +0000 UTC m=+51.309139852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.011906 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.012082 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.512047444 +0000 UTC m=+51.409641207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.012297 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.012606 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.512592269 +0000 UTC m=+51.410186022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.113746 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.114138 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.614123362 +0000 UTC m=+51.511717115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.119738 4693 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jznlz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.119861 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.121988 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jdxk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.122027 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" podUID="a85a548f-a204-4573-ae20-c62a17e17df3" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.125894 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" gracePeriod=30 Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.215656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.216799 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.716788146 +0000 UTC m=+51.614381899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.316416 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.316538 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.81651035 +0000 UTC m=+51.714104103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.316742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.317063 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.817055594 +0000 UTC m=+51.714649347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.418242 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.418704 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:25.91868843 +0000 UTC m=+51.816282183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.500722 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:25 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:25 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:25 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.500775 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.519361 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.519747 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.01972849 +0000 UTC m=+51.917322303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.621390 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.621585 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.121550921 +0000 UTC m=+52.019144674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.621767 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.622062 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.122049034 +0000 UTC m=+52.019642777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.723343 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.723539 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.223510295 +0000 UTC m=+52.121104048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.785169 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.824928 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.825419 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.325402858 +0000 UTC m=+52.222996611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.925986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.926220 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.426193811 +0000 UTC m=+52.323787564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:25 crc kubenswrapper[4693]: I1204 09:43:25.926487 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:25 crc kubenswrapper[4693]: E1204 09:43:25.926780 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.426765787 +0000 UTC m=+52.324359540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.028289 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.028968 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.528947827 +0000 UTC m=+52.426541580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.094790 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jdxk" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.144498 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.145859 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.645839033 +0000 UTC m=+52.543432786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.169369 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" event={"ID":"e40c792e-b179-402f-82d0-9744288c0680","Type":"ContainerStarted","Data":"a7c19de87f53569ab314f1b4159e7162a195f7901cd2014d679f6ef72f06c688"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.201521 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" event={"ID":"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb","Type":"ContainerStarted","Data":"16c78eca500632863a594396d681fd8da56bf54e33c3c6a4774700d6a521e51b"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.218656 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" event={"ID":"955f4b1c-a9cc-42e0-bbbb-456dde994dcf","Type":"ContainerStarted","Data":"eb7ca55a9097a86f60d726da76205b771de903ad4e4b43fd0ce6856695e228e1"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.245648 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.247236 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.747208932 +0000 UTC m=+52.644802745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.248121 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gc6fl" podStartSLOduration=30.248107576 podStartE2EDuration="30.248107576s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.222145498 +0000 UTC m=+52.119739251" watchObservedRunningTime="2025-12-04 09:43:26.248107576 +0000 UTC m=+52.145701329" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.260937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kncc4" event={"ID":"6954da61-bafb-4b35-aa61-0f120c34c747","Type":"ContainerStarted","Data":"347b784494811a961dfd42b0a628dccb2194c93b2ac6a78aab8a51481fe80bc0"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.289003 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wjxqk" event={"ID":"f800f28a-197d-4e69-b66d-3840065b674e","Type":"ContainerStarted","Data":"bb53a233eff02c2611e5679dbdb23deb98d47bb9e2b359bc97453f86dea1f3f0"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.289620 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.298659 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6jrqg" podStartSLOduration=30.298639056 podStartE2EDuration="30.298639056s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.26051658 +0000 UTC m=+52.158110363" watchObservedRunningTime="2025-12-04 09:43:26.298639056 +0000 UTC m=+52.196232809" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.322634 4693 generic.go:334] "Generic (PLEG): container finished" podID="36782e8d-b271-46a5-8f96-8979022991f2" containerID="dab9c1011ea07d88de65b3480f049b963e02c9aff2d843592df61f618cb2f585" exitCode=0 Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.322732 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" event={"ID":"36782e8d-b271-46a5-8f96-8979022991f2","Type":"ContainerDied","Data":"dab9c1011ea07d88de65b3480f049b963e02c9aff2d843592df61f618cb2f585"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.344986 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" event={"ID":"5c8b51b7-718d-43b5-9e18-58966747279f","Type":"ContainerStarted","Data":"a2c357900c550fab92c1ca531570e9d251887f845d9138b8556e17fcf69ad986"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.348211 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.349349 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.849315 +0000 UTC m=+52.746908823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.359450 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kncc4" podStartSLOduration=30.359429193 podStartE2EDuration="30.359429193s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.298298267 +0000 UTC m=+52.195892020" watchObservedRunningTime="2025-12-04 09:43:26.359429193 +0000 UTC m=+52.257022946" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.359675 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wjxqk" podStartSLOduration=15.359669179 podStartE2EDuration="15.359669179s" podCreationTimestamp="2025-12-04 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.356438893 +0000 UTC m=+52.254032646" watchObservedRunningTime="2025-12-04 09:43:26.359669179 +0000 UTC m=+52.257262932" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.363731 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" event={"ID":"bccb2393-5218-4cd0-9b7e-c9d19eab391b","Type":"ContainerStarted","Data":"15cc0f66b4c59a20fe52e25e614358e6db3b30836441e0e4785d91171bd574f5"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.368308 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" event={"ID":"ca1189f1-0dee-459a-bb4f-dfda69f2eee1","Type":"ContainerStarted","Data":"f80af3ad600c8599ada289413c636a195a1eacdae6367a0cfc4e54f2ae1223bb"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.391681 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" event={"ID":"57e20f16-dfe1-45b4-8b13-860011aac931","Type":"ContainerStarted","Data":"49422e20972f347f84602fd891e8e7eeda4ce8258c9a962149a7d21d1d1dfcd6"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.401922 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kkwtj" podStartSLOduration=31.401905556 podStartE2EDuration="31.401905556s" 
podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.392745639 +0000 UTC m=+52.290339392" watchObservedRunningTime="2025-12-04 09:43:26.401905556 +0000 UTC m=+52.299499309" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.428132 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" event={"ID":"debdfef3-4184-4a37-a818-e5c41c81e2fd","Type":"ContainerStarted","Data":"42b58478f3033e5608dc4fe81770a9784025c1a16b245b222112723f4711b3d3"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.446814 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" event={"ID":"6a14b590-276f-49be-961a-c459c975c8ab","Type":"ContainerStarted","Data":"19338fc269158cdc67f94d630272e97567ddb7fa22afb29d7dc3298de6e30481"} Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.447950 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.447994 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.448856 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.449711 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:26.949697403 +0000 UTC m=+52.847291156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.460488 4693 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jt5h7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.460554 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.508533 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:26 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:26 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:26 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.508582 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.513988 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.514665 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: W1204 09:43:26.534461 4693 reflector.go:561] object-"openshift-kube-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.534505 4693 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:43:26 crc kubenswrapper[4693]: W1204 09:43:26.534529 4693 reflector.go:561] object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n": failed to list *v1.Secret: secrets "installer-sa-dockercfg-kjl2n" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.534598 4693 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"installer-sa-dockercfg-kjl2n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"installer-sa-dockercfg-kjl2n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.549902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.550129 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.550198 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.551731 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:27.051719329 +0000 UTC m=+52.949313082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.581597 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.658660 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.658927 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.658963 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.659397 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.159379887 +0000 UTC m=+53.056973640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.659429 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.678969 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzbkw" podStartSLOduration=30.678947384 podStartE2EDuration="30.678947384s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.632276688 +0000 UTC m=+52.529870441" watchObservedRunningTime="2025-12-04 09:43:26.678947384 +0000 UTC m=+52.576541137" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.679173 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7jlpl" podStartSLOduration=30.67916905 podStartE2EDuration="30.67916905s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.677103274 +0000 UTC m=+52.574697027" watchObservedRunningTime="2025-12-04 09:43:26.67916905 +0000 UTC m=+52.576762803" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.705717 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" podStartSLOduration=30.705699514 podStartE2EDuration="30.705699514s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.702624911 +0000 UTC m=+52.600218664" watchObservedRunningTime="2025-12-04 09:43:26.705699514 +0000 UTC m=+52.603293267" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.726313 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" podStartSLOduration=30.726292358 podStartE2EDuration="30.726292358s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.724839319 +0000 UTC m=+52.622433072" watchObservedRunningTime="2025-12-04 09:43:26.726292358 +0000 UTC m=+52.623886111" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.759929 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.760279 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.260262533 +0000 UTC m=+53.157856336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.776289 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" podStartSLOduration=30.776273453 podStartE2EDuration="30.776273453s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.770786985 +0000 UTC m=+52.668380738" watchObservedRunningTime="2025-12-04 09:43:26.776273453 +0000 UTC m=+52.673867206" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.805588 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5fvns" podStartSLOduration=30.805569662 podStartE2EDuration="30.805569662s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.804431521 +0000 UTC m=+52.702025274" watchObservedRunningTime="2025-12-04 09:43:26.805569662 +0000 UTC m=+52.703163415" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.824353 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gh7dl" podStartSLOduration=30.824312957 podStartE2EDuration="30.824312957s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.823209447 +0000 UTC m=+52.720803220" watchObservedRunningTime="2025-12-04 09:43:26.824312957 +0000 UTC m=+52.721906710" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.861654 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.36162136 +0000 UTC m=+53.259215113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.861327 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.866466 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.866827 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.36680723 +0000 UTC m=+53.264400983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.867504 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hz7pw" podStartSLOduration=30.867485649 podStartE2EDuration="30.867485649s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:26.866012649 +0000 UTC m=+52.763606402" watchObservedRunningTime="2025-12-04 09:43:26.867485649 +0000 UTC m=+52.765079402" Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.967765 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.967936 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.467903071 +0000 UTC m=+53.365496834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:26 crc kubenswrapper[4693]: I1204 09:43:26.968024 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:26 crc kubenswrapper[4693]: E1204 09:43:26.968310 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.468297193 +0000 UTC m=+53.365890946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.069442 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.069635 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.569604859 +0000 UTC m=+53.467198622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.069839 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.070182 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.570166844 +0000 UTC m=+53.467760597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.157822 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.158750 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.160307 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.170806 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.170991 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.670961937 +0000 UTC m=+53.568555720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.171129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.171202 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.171436 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.67142183 +0000 UTC m=+53.569015653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.177820 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.202682 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.272396 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:27.772376168 +0000 UTC m=+53.669969921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272579 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272651 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272704 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272755 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272809 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.272841 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssh2h\" (UniqueName: \"kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 
09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.273127 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.773116737 +0000 UTC m=+53.670710500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.273533 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.285475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.294702 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.370381 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.371257 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.373401 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.373677 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssh2h\" (UniqueName: \"kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.373707 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:27.873685744 +0000 UTC m=+53.771279527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.373758 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.373815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.373863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.374253 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.874240389 +0000 UTC m=+53.771834142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.374527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.374605 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.388032 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.415236 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.425799 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssh2h\" (UniqueName: \"kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h\") pod \"community-operators-c7zjm\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.438965 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.447268 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.447893 4693 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zctpg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.447994 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" podUID="1c46aadf-ba22-4bdc-b76a-8b9ad8880368" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.475723 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.476313 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.476544 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.477040 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.477077 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9k8\" (UniqueName: \"kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.477220 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:27.977203091 +0000 UTC m=+53.874796854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.490667 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.491274 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" event={"ID":"c7e74be3-f8f4-4f94-8f11-657cb2c75ceb","Type":"ContainerStarted","Data":"d5bc43c46a9849ab52adeee6a5d856757d04e0928d17c74b91de0f5b351bec65"} Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.499961 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.503554 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:27 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:27 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:27 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.503606 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.542404 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.543955 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.570999 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.576614 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.577902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.577931 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.577975 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.578209 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.578239 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9k8\" (UniqueName: \"kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.578280 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znt6q\" (UniqueName: \"kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.578324 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.579376 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.580318 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.080302067 +0000 UTC m=+53.977895820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.580764 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.615850 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9k8\" (UniqueName: \"kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8\") pod \"certified-operators-5rl2n\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.681917 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.682461 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.682509 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.682621 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znt6q\" (UniqueName: \"kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.682728 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.182698293 +0000 UTC m=+54.080292056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.683234 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.683259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.692930 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.700467 4693 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.700521 4693 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/revision-pruner-9-crc: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.700575 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access podName:49094e9f-a512-4b40-9dd8-d3b77873cb71 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.200557913 +0000 UTC m=+54.098151656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access") pod "revision-pruner-9-crc" (UID: "49094e9f-a512-4b40-9dd8-d3b77873cb71") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.750949 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znt6q\" (UniqueName: \"kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q\") pod \"community-operators-l7q9x\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.784139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.784486 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.284474552 +0000 UTC m=+54.182068305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.891829 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.892248 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.392215642 +0000 UTC m=+54.289809395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.892317 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.950532 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 09:43:27 crc kubenswrapper[4693]: I1204 09:43:27.993755 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:27 crc kubenswrapper[4693]: E1204 09:43:27.994156 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.494139756 +0000 UTC m=+54.391733509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.094886 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.095256 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.595240997 +0000 UTC m=+54.492834750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.195948 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.196390 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:28.69637505 +0000 UTC m=+54.593968803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.233427 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.238928 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.254424 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.254547 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.264758 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.296877 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.297211 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99788\" (UniqueName: \"kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.297328 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.297396 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.297429 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:28 
crc kubenswrapper[4693]: E1204 09:43:28.297700 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.797671246 +0000 UTC m=+54.695264999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.333190 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.334089 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.350581 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.386579 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.399027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.399078 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.399110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99788\" (UniqueName: \"kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.399147 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.399640 4693 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:28.899628111 +0000 UTC m=+54.797221864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.400102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.400371 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.424919 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.452102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99788\" (UniqueName: \"kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788\") pod \"certified-operators-nnfsl\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.501864 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.511535 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.011506543 +0000 UTC m=+54.909100296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.515393 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:28 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:28 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:28 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.515458 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.517607 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0889dd25525fbb1196a071646cbdd944797bad7f98e44885f95f880e842a39bc"} Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.529238 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.543246 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" event={"ID":"36782e8d-b271-46a5-8f96-8979022991f2","Type":"ContainerDied","Data":"05c72c11de2e122a295b7ae1b6ebc604d5d981e53181835892c218179020e66a"} Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.543282 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c72c11de2e122a295b7ae1b6ebc604d5d981e53181835892c218179020e66a" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.552827 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.552800724 podStartE2EDuration="552.800724ms" podCreationTimestamp="2025-12-04 09:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:28.492530221 +0000 UTC m=+54.390123974" watchObservedRunningTime="2025-12-04 09:43:28.552800724 +0000 UTC m=+54.450394477" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.565077 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.604259 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.607046 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.607334 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.107319271 +0000 UTC m=+55.004913024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.707668 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume\") pod \"36782e8d-b271-46a5-8f96-8979022991f2\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.707775 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9dm8\" (UniqueName: \"kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8\") pod \"36782e8d-b271-46a5-8f96-8979022991f2\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.707793 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume\") pod \"36782e8d-b271-46a5-8f96-8979022991f2\" (UID: \"36782e8d-b271-46a5-8f96-8979022991f2\") " Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.707903 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.718517 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8" (OuterVolumeSpecName: "kube-api-access-g9dm8") pod "36782e8d-b271-46a5-8f96-8979022991f2" (UID: "36782e8d-b271-46a5-8f96-8979022991f2"). InnerVolumeSpecName "kube-api-access-g9dm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.718592 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.218576847 +0000 UTC m=+55.116170600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.719221 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume" (OuterVolumeSpecName: "config-volume") pod "36782e8d-b271-46a5-8f96-8979022991f2" (UID: "36782e8d-b271-46a5-8f96-8979022991f2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.720641 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.724846 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36782e8d-b271-46a5-8f96-8979022991f2" (UID: "36782e8d-b271-46a5-8f96-8979022991f2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.809806 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.809896 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36782e8d-b271-46a5-8f96-8979022991f2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.809906 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9dm8\" (UniqueName: \"kubernetes.io/projected/36782e8d-b271-46a5-8f96-8979022991f2-kube-api-access-g9dm8\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.809916 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36782e8d-b271-46a5-8f96-8979022991f2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.810184 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:29.310173842 +0000 UTC m=+55.207767595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.819664 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:43:28 crc kubenswrapper[4693]: W1204 09:43:28.875570 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-3a33a6f9b4ae32a6ec14ae82c46bd4c5215156bec283aaa51fd0d04ac605507a WatchSource:0}: Error finding container 3a33a6f9b4ae32a6ec14ae82c46bd4c5215156bec283aaa51fd0d04ac605507a: Status 404 returned error can't find the container with id 3a33a6f9b4ae32a6ec14ae82c46bd4c5215156bec283aaa51fd0d04ac605507a Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.911481 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:28 crc kubenswrapper[4693]: E1204 09:43:28.911826 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.411812418 +0000 UTC m=+55.309406171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:28 crc kubenswrapper[4693]: I1204 09:43:28.992787 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:43:29 crc kubenswrapper[4693]: W1204 09:43:29.006020 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod49094e9f_a512_4b40_9dd8_d3b77873cb71.slice/crio-ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191 WatchSource:0}: Error finding container ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191: Status 404 returned error can't find the container with id ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191 Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.012422 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.012684 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.512672693 +0000 UTC m=+55.410266446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.114195 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.114585 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.614570605 +0000 UTC m=+55.512164358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.134221 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.178608 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.178800 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36782e8d-b271-46a5-8f96-8979022991f2" containerName="collect-profiles" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.178811 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="36782e8d-b271-46a5-8f96-8979022991f2" containerName="collect-profiles" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.178897 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="36782e8d-b271-46a5-8f96-8979022991f2" containerName="collect-profiles" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.179195 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.183806 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.183963 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.191836 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.217723 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.218127 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.718104682 +0000 UTC m=+55.615698435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.318998 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.319305 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.319507 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.819473581 +0000 UTC m=+55.717067334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.319662 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.319795 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.320197 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.82017487 +0000 UTC m=+55.717768663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.420856 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.421045 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:29.921020184 +0000 UTC m=+55.818613937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.421119 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.421144 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.421217 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.421284 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.421586 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:43:29.92157294 +0000 UTC m=+55.819166693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.438092 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.503578 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:29 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:29 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:29 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.503649 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.504959 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.522820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.523010 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.022985049 +0000 UTC m=+55.920578792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.523527 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.523873 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.023861573 +0000 UTC m=+55.921455326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.524164 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.525116 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.532725 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.546034 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.556657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerStarted","Data":"2fc69693333ae9fb07ef965970a82cf7d6b6d46a9657cd58a686237d9781cac5"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.558948 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a9567c9b909b77e1efc5f7981f486bc7d6f2b66e0c521636007a236721190910"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.560094 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerStarted","Data":"2c8820b01aeb9bc55a776386278d14b4ee51ab10cdf62db5737c7c53c0644022"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.562245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerStarted","Data":"9e4d794da92ccafa900c741e98d3f2b0b53af66fb7b77d4a13ac64f9e8c14b6c"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.563078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3a33a6f9b4ae32a6ec14ae82c46bd4c5215156bec283aaa51fd0d04ac605507a"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.565360 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerStarted","Data":"2ae632fe1c4f4979c0c949743ec3a04a2eb9492d80367d94757a2a8159031e88"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.579034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"49094e9f-a512-4b40-9dd8-d3b77873cb71","Type":"ContainerStarted","Data":"ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191"} Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.579116 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.583214 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.589764 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zkx54" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.616976 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" podStartSLOduration=34.616960459 podStartE2EDuration="34.616960459s" podCreationTimestamp="2025-12-04 09:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:29.615619343 +0000 UTC m=+55.513213086" watchObservedRunningTime="2025-12-04 09:43:29.616960459 +0000 UTC m=+55.514554212" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.625144 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.625474 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.625529 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmjxh\" (UniqueName: \"kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.625552 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.625686 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.125671343 +0000 UTC m=+56.023265106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.727040 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.727195 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.727261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjxh\" (UniqueName: \"kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.727300 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.727708 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.728272 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.729214 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.22919798 +0000 UTC m=+56.126791723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.746151 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmjxh\" (UniqueName: \"kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh\") pod \"redhat-marketplace-2d6g4\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.828005 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.828382 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.328365349 +0000 UTC m=+56.225959102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.857767 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.903189 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.929828 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:29 crc kubenswrapper[4693]: E1204 09:43:29.930276 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.430253982 +0000 UTC m=+56.327847745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.934857 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.936138 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:29 crc kubenswrapper[4693]: I1204 09:43:29.953659 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.030806 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.530775428 +0000 UTC m=+56.428369201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.030590 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.032111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.032204 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.032245 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 
09:43:30.032273 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2pw\" (UniqueName: \"kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.032761 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.532749221 +0000 UTC m=+56.430343064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.038690 4693 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.136820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.137364 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.137387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.137408 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2pw\" (UniqueName: \"kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.137811 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.637797559 +0000 UTC m=+56.535391312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.138173 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.138417 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.156494 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.159026 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2pw\" (UniqueName: \"kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw\") pod \"redhat-marketplace-mdqwg\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.239111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.239527 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.739511937 +0000 UTC m=+56.637105700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.254349 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.325954 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.327293 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.331025 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.342604 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.342771 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.842749486 +0000 UTC m=+56.740343239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.342849 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.343170 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.843160197 +0000 UTC m=+56.740753950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.347737 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.444463 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.444622 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bptj\" (UniqueName: \"kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.444714 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.444758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.444897 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:30.944880975 +0000 UTC m=+56.842474728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.457096 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:43:30 crc kubenswrapper[4693]: W1204 09:43:30.460535 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8930f8_8514_4179_a3b6_3408199d5cd8.slice/crio-895add6ef70cf2ce83e6db1ef6d5dacb17317a0d27166c401644494af2c9bfbc WatchSource:0}: Error finding container 895add6ef70cf2ce83e6db1ef6d5dacb17317a0d27166c401644494af2c9bfbc: Status 404 returned error can't find the container with id 895add6ef70cf2ce83e6db1ef6d5dacb17317a0d27166c401644494af2c9bfbc Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.502105 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:30 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:30 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:30 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.502170 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.522008 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.522932 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.533771 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.545783 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.545845 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.545886 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bptj\" (UniqueName: \"kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.545954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: E1204 09:43:30.546451 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:43:31.046436608 +0000 UTC m=+56.944030361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mnzz8" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.546683 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.546889 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.560489 4693 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T09:43:30.038723071Z","Handler":null,"Name":""} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.564769 4693 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.564811 4693 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.577310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bptj\" (UniqueName: \"kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj\") pod \"redhat-operators-wgqzj\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.584589 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerStarted","Data":"895add6ef70cf2ce83e6db1ef6d5dacb17317a0d27166c401644494af2c9bfbc"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.586209 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerStarted","Data":"32b7a15cdff06a3f3e0c8efb8af5ac3b758859eb6a1fe35456753a77ab784cf1"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.587107 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d96b97e7-545b-4a19-8438-1886ea643ea3","Type":"ContainerStarted","Data":"db53d3c49b2922ee0fe7a3198409fdfcde4edf8bc40f60f6f52112d982c3ad48"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.597240 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3666dd8b9c5fcae82b82fce4c3f3e780524188ec291376b66b8d54cc637e75de"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.601614 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" event={"ID":"bccb2393-5218-4cd0-9b7e-c9d19eab391b","Type":"ContainerStarted","Data":"f62616d48926e13dd0f6a561e3b2b3b97b7498417fd2293bce04e815876fa9fa"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.602732 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerStarted","Data":"bb36fb658639b087df9af228a79c625eae9a86a09ce8b379ca20a4d9a507fc5b"} Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.646580 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.647189 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx856\" (UniqueName: \"kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.647293 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.647465 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.652554 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.685165 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.749238 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.749782 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.749844 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx856\" (UniqueName: \"kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.749891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.749937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.750232 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.775320 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx856\" (UniqueName: \"kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856\") pod \"redhat-operators-sjp76\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.809891 4693 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.809942 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.847542 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.856168 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:43:30 crc kubenswrapper[4693]: I1204 09:43:30.861530 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mnzz8\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.059885 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:43:31 crc kubenswrapper[4693]: W1204 09:43:31.068247 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee4fe2f_0b17_4ff0_9f0c_9c7ae6b8ad6f.slice/crio-8a85b51cff4f4721346609bfd5619a22eba5b03bc2aa0b1d7e316a0116a6c7a7 WatchSource:0}: Error finding container 8a85b51cff4f4721346609bfd5619a22eba5b03bc2aa0b1d7e316a0116a6c7a7: Status 404 returned error can't find the container with id 8a85b51cff4f4721346609bfd5619a22eba5b03bc2aa0b1d7e316a0116a6c7a7 Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.071009 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.261484 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:43:31 crc kubenswrapper[4693]: W1204 09:43:31.273656 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ce889a4_48b5_429d_8d0e_fc270a53385b.slice/crio-c800ca3f53b9611a85ce21d51b60eacc2b88b6526a6b6e1062acb2bf57f42427 WatchSource:0}: Error finding container c800ca3f53b9611a85ce21d51b60eacc2b88b6526a6b6e1062acb2bf57f42427: Status 404 returned error can't find the container with id c800ca3f53b9611a85ce21d51b60eacc2b88b6526a6b6e1062acb2bf57f42427 Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.501921 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:31 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:31 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:31 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.501978 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.623326 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"49094e9f-a512-4b40-9dd8-d3b77873cb71","Type":"ContainerStarted","Data":"1fb61a0e459615374c99ae3e708a32e503dd290423de80cbff9b3785952abb36"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.625182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerStarted","Data":"931f810d664bed92f796b0aa180fecaa0bdb98e541abddd20e76338315221e79"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.627096 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a9e6fa4808fb6c6da4ca20023eeafabe40e8d811595ed98b2dbe225d41461910"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.627993 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerStarted","Data":"8a85b51cff4f4721346609bfd5619a22eba5b03bc2aa0b1d7e316a0116a6c7a7"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.630695 4693 generic.go:334] "Generic (PLEG): container finished" podID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerID="32b7a15cdff06a3f3e0c8efb8af5ac3b758859eb6a1fe35456753a77ab784cf1" exitCode=0 Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.630764 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerDied","Data":"32b7a15cdff06a3f3e0c8efb8af5ac3b758859eb6a1fe35456753a77ab784cf1"} Dec 04 09:43:31 
crc kubenswrapper[4693]: I1204 09:43:31.632568 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerStarted","Data":"10787858b82d57fb7f87b84846a666a2034b611b52ab3f89204097bc6d1281fa"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.635343 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a7cc948b42e6a0040cfd2f8e20225ba9d5c2be2d984f98ce8ac844d3f60bc800"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.639998 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerStarted","Data":"74211032aea3ab9631ae47bbcfc6896290d74c0d6e2951ff1edb5bc8497ee5a0"} Dec 04 09:43:31 crc kubenswrapper[4693]: I1204 09:43:31.641485 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" event={"ID":"3ce889a4-48b5-429d-8d0e-fc270a53385b","Type":"ContainerStarted","Data":"c800ca3f53b9611a85ce21d51b60eacc2b88b6526a6b6e1062acb2bf57f42427"} Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.196431 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wjxqk" Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.471910 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.501482 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:32 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:32 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:32 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.501529 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.648686 4693 generic.go:334] "Generic (PLEG): container finished" podID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerID="74211032aea3ab9631ae47bbcfc6896290d74c0d6e2951ff1edb5bc8497ee5a0" exitCode=0 Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.648772 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerDied","Data":"74211032aea3ab9631ae47bbcfc6896290d74c0d6e2951ff1edb5bc8497ee5a0"} Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.650325 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac8a0ee1-b340-421c-8496-74757e180a20" containerID="ed5795b6e212f2841ff5cdc30fff5e5902df2537d91390ec4e25ef4164d3ac9d" exitCode=0 Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.650405 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerDied","Data":"ed5795b6e212f2841ff5cdc30fff5e5902df2537d91390ec4e25ef4164d3ac9d"} Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.651636 4693 generic.go:334] "Generic (PLEG): container finished" podID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerID="931f810d664bed92f796b0aa180fecaa0bdb98e541abddd20e76338315221e79" exitCode=0 Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.652494 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerDied","Data":"931f810d664bed92f796b0aa180fecaa0bdb98e541abddd20e76338315221e79"} Dec 04 09:43:32 crc kubenswrapper[4693]: I1204 09:43:32.652534 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.169979 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.282971 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.283023 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.291421 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-sg997 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.291469 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sg997" podUID="57ffaa1b-fb6e-4c7b-8809-ca9bbd8f527e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.421767 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pstsq" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.468784 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zjrqj" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.502111 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:33 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:33 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:33 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 
09:43:33.502163 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.613479 4693 patch_prober.go:28] interesting pod/console-f9d7485db-mhzjn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.613547 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mhzjn" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.657147 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerStarted","Data":"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183"} Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.910132 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.910518 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:33 crc kubenswrapper[4693]: I1204 09:43:33.928628 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:34 crc kubenswrapper[4693]: E1204 09:43:34.178818 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:34 crc kubenswrapper[4693]: E1204 09:43:34.182322 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:34 crc kubenswrapper[4693]: E1204 09:43:34.186467 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:34 crc kubenswrapper[4693]: E1204 09:43:34.186527 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.502301 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:34 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:34 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:34 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.502376 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.664976 4693 generic.go:334] "Generic (PLEG): container finished" podID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerID="b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.666057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerDied","Data":"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.667451 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.668599 4693 generic.go:334] "Generic (PLEG): container finished" podID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerID="5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.668668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerDied","Data":"5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.678025 4693 generic.go:334] "Generic (PLEG): container finished" podID="d96b97e7-545b-4a19-8438-1886ea643ea3" containerID="abd4bdc9c15aec0365a373ca4725f64d5665cb464ece78d340acb80131db25ab" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.678112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d96b97e7-545b-4a19-8438-1886ea643ea3","Type":"ContainerDied","Data":"abd4bdc9c15aec0365a373ca4725f64d5665cb464ece78d340acb80131db25ab"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.680445 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" event={"ID":"bccb2393-5218-4cd0-9b7e-c9d19eab391b","Type":"ContainerStarted","Data":"513fd884425b99af5b238ab82aa3f6ef39368ff71f487b0a15d6bcb36ce49454"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.680586 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" event={"ID":"bccb2393-5218-4cd0-9b7e-c9d19eab391b","Type":"ContainerStarted","Data":"88585b4853b6ef020553f2b30e48ce166bf6e95c2dfa90310ad7f76faac7538c"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.682147 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" 
event={"ID":"3ce889a4-48b5-429d-8d0e-fc270a53385b","Type":"ContainerStarted","Data":"3087f3281fa40b31afa9506d0bb4b2fc3ece5f3180d1339d8fe43d648f58f9ec"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.682815 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.686799 4693 generic.go:334] "Generic (PLEG): container finished" podID="49094e9f-a512-4b40-9dd8-d3b77873cb71" containerID="1fb61a0e459615374c99ae3e708a32e503dd290423de80cbff9b3785952abb36" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.686893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"49094e9f-a512-4b40-9dd8-d3b77873cb71","Type":"ContainerDied","Data":"1fb61a0e459615374c99ae3e708a32e503dd290423de80cbff9b3785952abb36"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.689763 4693 generic.go:334] "Generic (PLEG): container finished" podID="7070356d-e89a-4f1c-a247-051bf520ae02" containerID="ed157ad9534a02a18ade935f9d1c5b2a35dcdc53e5f6d79a1fcd67b7227952e4" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.690068 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerDied","Data":"ed157ad9534a02a18ade935f9d1c5b2a35dcdc53e5f6d79a1fcd67b7227952e4"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.694110 4693 generic.go:334] "Generic (PLEG): container finished" podID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerID="e3f7ca85ddaf3f7a9327b7d4874760517c0b87134b47474d2cf4dcb5b20535b8" exitCode=0 Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.695478 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerDied","Data":"e3f7ca85ddaf3f7a9327b7d4874760517c0b87134b47474d2cf4dcb5b20535b8"} Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.702158 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tzx57" Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.865405 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" podStartSLOduration=38.865388696 podStartE2EDuration="38.865388696s" podCreationTimestamp="2025-12-04 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:34.864659736 +0000 UTC m=+60.762253489" watchObservedRunningTime="2025-12-04 09:43:34.865388696 +0000 UTC m=+60.762982449" Dec 04 09:43:34 crc kubenswrapper[4693]: I1204 09:43:34.920388 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5vvnq" podStartSLOduration=23.920371806 podStartE2EDuration="23.920371806s" podCreationTimestamp="2025-12-04 09:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:43:34.920096499 +0000 UTC m=+60.817690252" watchObservedRunningTime="2025-12-04 09:43:34.920371806 +0000 UTC m=+60.817965549" Dec 04 09:43:35 crc kubenswrapper[4693]: I1204 09:43:35.502852 4693 patch_prober.go:28] interesting 
pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:35 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:35 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:35 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:35 crc kubenswrapper[4693]: I1204 09:43:35.503143 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:35 crc kubenswrapper[4693]: I1204 09:43:35.970449 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.051667 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") pod \"49094e9f-a512-4b40-9dd8-d3b77873cb71\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.051725 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir\") pod \"49094e9f-a512-4b40-9dd8-d3b77873cb71\" (UID: \"49094e9f-a512-4b40-9dd8-d3b77873cb71\") " Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.051861 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49094e9f-a512-4b40-9dd8-d3b77873cb71" (UID: "49094e9f-a512-4b40-9dd8-d3b77873cb71"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.052054 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49094e9f-a512-4b40-9dd8-d3b77873cb71-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.058599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49094e9f-a512-4b40-9dd8-d3b77873cb71" (UID: "49094e9f-a512-4b40-9dd8-d3b77873cb71"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.061385 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.153102 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir\") pod \"d96b97e7-545b-4a19-8438-1886ea643ea3\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.153216 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access\") pod \"d96b97e7-545b-4a19-8438-1886ea643ea3\" (UID: \"d96b97e7-545b-4a19-8438-1886ea643ea3\") " Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.153491 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49094e9f-a512-4b40-9dd8-d3b77873cb71-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.153890 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d96b97e7-545b-4a19-8438-1886ea643ea3" (UID: "d96b97e7-545b-4a19-8438-1886ea643ea3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.161274 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d96b97e7-545b-4a19-8438-1886ea643ea3" (UID: "d96b97e7-545b-4a19-8438-1886ea643ea3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.254925 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d96b97e7-545b-4a19-8438-1886ea643ea3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.254978 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d96b97e7-545b-4a19-8438-1886ea643ea3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.503988 4693 patch_prober.go:28] interesting pod/router-default-5444994796-55r8b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:43:36 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Dec 04 09:43:36 crc kubenswrapper[4693]: [+]process-running ok Dec 04 09:43:36 crc kubenswrapper[4693]: healthz check failed Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.504144 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55r8b" podUID="833b50c0-572f-4534-8a96-af514ff81953" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.734953 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d96b97e7-545b-4a19-8438-1886ea643ea3","Type":"ContainerDied","Data":"db53d3c49b2922ee0fe7a3198409fdfcde4edf8bc40f60f6f52112d982c3ad48"} Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.735025 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db53d3c49b2922ee0fe7a3198409fdfcde4edf8bc40f60f6f52112d982c3ad48" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.735203 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.759068 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.759718 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"49094e9f-a512-4b40-9dd8-d3b77873cb71","Type":"ContainerDied","Data":"ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191"} Dec 04 09:43:36 crc kubenswrapper[4693]: I1204 09:43:36.759741 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec8d819a83ea3381c68ff770b1b0ee8fdf135598381261a8d90c57ed7c5ba191" Dec 04 09:43:37 crc kubenswrapper[4693]: I1204 09:43:37.504253 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:37 crc kubenswrapper[4693]: I1204 09:43:37.506509 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-55r8b" Dec 04 09:43:43 crc kubenswrapper[4693]: I1204 09:43:43.288688 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sg997" Dec 04 09:43:43 crc kubenswrapper[4693]: I1204 09:43:43.617575 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:43 crc kubenswrapper[4693]: I1204 09:43:43.667496 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:43:44 crc kubenswrapper[4693]: E1204 09:43:44.188364 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:44 crc kubenswrapper[4693]: E1204 09:43:44.197509 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:44 crc kubenswrapper[4693]: E1204 09:43:44.200234 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:44 crc kubenswrapper[4693]: E1204 09:43:44.200287 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:43:51 crc kubenswrapper[4693]: I1204 09:43:51.077076 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:43:54 crc kubenswrapper[4693]: E1204 09:43:54.177957 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:54 crc kubenswrapper[4693]: E1204 09:43:54.179174 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:54 crc kubenswrapper[4693]: E1204 09:43:54.181053 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:43:54 crc kubenswrapper[4693]: E1204 09:43:54.181223 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:43:54 crc kubenswrapper[4693]: I1204 09:43:54.471411 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g72br" Dec 04 09:43:58 crc kubenswrapper[4693]: I1204 09:43:58.320697 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-74srj_d90f3655-18d4-4dec-b9d4-7309fa424c4e/kube-multus-additional-cni-plugins/0.log" Dec 04 09:43:58 crc kubenswrapper[4693]: I1204 09:43:58.320892 4693 generic.go:334] "Generic (PLEG): container finished" podID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" exitCode=137 Dec 04 09:43:58 crc kubenswrapper[4693]: I1204 09:43:58.320923 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" event={"ID":"d90f3655-18d4-4dec-b9d4-7309fa424c4e","Type":"ContainerDied","Data":"0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6"} Dec 04 09:44:04 crc kubenswrapper[4693]: E1204 09:44:04.175765 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:04 crc kubenswrapper[4693]: E1204 09:44:04.176594 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:04 crc kubenswrapper[4693]: E1204 09:44:04.177217 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:04 crc kubenswrapper[4693]: E1204 09:44:04.177279 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.969630 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:44:05 crc kubenswrapper[4693]: E1204 09:44:05.970289 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49094e9f-a512-4b40-9dd8-d3b77873cb71" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.970303 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="49094e9f-a512-4b40-9dd8-d3b77873cb71" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: E1204 09:44:05.970320 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96b97e7-545b-4a19-8438-1886ea643ea3" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.970346 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96b97e7-545b-4a19-8438-1886ea643ea3" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.970476 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="49094e9f-a512-4b40-9dd8-d3b77873cb71" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.970497 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96b97e7-545b-4a19-8438-1886ea643ea3" containerName="pruner" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.970874 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.977752 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.978313 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 09:44:05 crc kubenswrapper[4693]: I1204 09:44:05.982639 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.048146 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.048223 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.149876 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.149959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.150047 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.168033 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:06 crc kubenswrapper[4693]: I1204 09:44:06.298898 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:44:07 crc kubenswrapper[4693]: I1204 09:44:07.450737 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:44:08 crc kubenswrapper[4693]: I1204 09:44:08.478050 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.779939 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.781958 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.786811 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.805034 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.805159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.805689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.814761 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.814734239 podStartE2EDuration="2.814734239s" podCreationTimestamp="2025-12-04 09:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:10.806621697 +0000 UTC m=+96.704215480" watchObservedRunningTime="2025-12-04 09:44:10.814734239 +0000 UTC m=+96.712328032" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.906946 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.907102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.907096 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.907167 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.907210 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:10 crc kubenswrapper[4693]: I1204 09:44:10.938931 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access\") pod \"installer-9-crc\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:11 crc kubenswrapper[4693]: I1204 09:44:11.112111 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:44:14 crc kubenswrapper[4693]: E1204 09:44:14.176435 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:14 crc kubenswrapper[4693]: E1204 09:44:14.177228 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:14 crc kubenswrapper[4693]: E1204 09:44:14.177692 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:14 crc kubenswrapper[4693]: E1204 09:44:14.177728 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:24 crc kubenswrapper[4693]: E1204 09:44:24.176396 4693 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:24 crc kubenswrapper[4693]: E1204 09:44:24.178203 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:24 crc kubenswrapper[4693]: E1204 09:44:24.178816 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:24 crc kubenswrapper[4693]: E1204 09:44:24.178880 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:34 crc kubenswrapper[4693]: E1204 09:44:34.175596 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:34 crc kubenswrapper[4693]: E1204 09:44:34.176478 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:34 crc kubenswrapper[4693]: E1204 09:44:34.176885 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:34 crc kubenswrapper[4693]: E1204 09:44:34.176918 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" 
containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:44 crc kubenswrapper[4693]: E1204 09:44:44.177138 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:44 crc kubenswrapper[4693]: E1204 09:44:44.178162 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:44 crc kubenswrapper[4693]: E1204 09:44:44.178653 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:44 crc kubenswrapper[4693]: E1204 09:44:44.178726 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.383455 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.384196 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99788,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nnfsl_openshift-marketplace(ac8a0ee1-b340-421c-8496-74757e180a20): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.385613 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nnfsl" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.406162 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.406351 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cf9k8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5rl2n_openshift-marketplace(2bedbed9-581a-414f-a92c-fc4933fcac93): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:52 crc kubenswrapper[4693]: E1204 09:44:52.407464 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5rl2n" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" Dec 04 09:44:54 crc kubenswrapper[4693]: E1204 09:44:54.175391 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:54 crc kubenswrapper[4693]: E1204 09:44:54.176049 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:54 crc kubenswrapper[4693]: E1204 09:44:54.176345 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 09:44:54 crc kubenswrapper[4693]: E1204 09:44:54.176366 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:44:55 crc kubenswrapper[4693]: E1204 09:44:55.550831 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 09:44:55 crc kubenswrapper[4693]: E1204 09:44:55.551006 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bptj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wgqzj_openshift-marketplace(7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:55 crc kubenswrapper[4693]: E1204 09:44:55.552175 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wgqzj" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.239702 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wgqzj" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.316188 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.316397 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ssh2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-c7zjm_openshift-marketplace(a7c911e8-e91f-4c2d-9c38-d6bce33a819f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.317751 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-c7zjm" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.334468 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.334608 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-znt6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l7q9x_openshift-marketplace(6a3f510e-344a-4433-bfbb-ed6c76c3a4b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:57 crc kubenswrapper[4693]: E1204 09:44:57.335694 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l7q9x" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.184102 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l7q9x" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.184305 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-c7zjm" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.246028 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-74srj_d90f3655-18d4-4dec-b9d4-7309fa424c4e/kube-multus-additional-cni-plugins/0.log" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.246505 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.262976 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.263150 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5m2pw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mdqwg_openshift-marketplace(de8930f8-8514-4179-a3b6-3408199d5cd8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.266039 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mdqwg" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.307779 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.308029 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bx856,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-sjp76_openshift-marketplace(aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.308431 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.308503 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bmjxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-2d6g4_openshift-marketplace(7070356d-e89a-4f1c-a247-051bf520ae02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.308608 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready\") pod \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.308762 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist\") pod \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.308967 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpr68\" (UniqueName: \"kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68\") pod \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.309018 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir\") pod \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\" (UID: \"d90f3655-18d4-4dec-b9d4-7309fa424c4e\") " Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.309578 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "d90f3655-18d4-4dec-b9d4-7309fa424c4e" (UID: "d90f3655-18d4-4dec-b9d4-7309fa424c4e"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.310072 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready" (OuterVolumeSpecName: "ready") pod "d90f3655-18d4-4dec-b9d4-7309fa424c4e" (UID: "d90f3655-18d4-4dec-b9d4-7309fa424c4e"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.310153 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2d6g4" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.310197 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-sjp76" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.310800 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "d90f3655-18d4-4dec-b9d4-7309fa424c4e" (UID: "d90f3655-18d4-4dec-b9d4-7309fa424c4e"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.320758 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68" (OuterVolumeSpecName: "kube-api-access-cpr68") pod "d90f3655-18d4-4dec-b9d4-7309fa424c4e" (UID: "d90f3655-18d4-4dec-b9d4-7309fa424c4e"). InnerVolumeSpecName "kube-api-access-cpr68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.410412 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpr68\" (UniqueName: \"kubernetes.io/projected/d90f3655-18d4-4dec-b9d4-7309fa424c4e-kube-api-access-cpr68\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.410444 4693 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d90f3655-18d4-4dec-b9d4-7309fa424c4e-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.410457 4693 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d90f3655-18d4-4dec-b9d4-7309fa424c4e-ready\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.410471 4693 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d90f3655-18d4-4dec-b9d4-7309fa424c4e-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.643903 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.686893 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:44:58 crc kubenswrapper[4693]: W1204 09:44:58.696503 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda72ca277_b61f_4242_9e81_0bbabbffab52.slice/crio-7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619 WatchSource:0}: Error finding container 
7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619: Status 404 returned error can't find the container with id 7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619 Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.738470 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a72ca277-b61f-4242-9e81-0bbabbffab52","Type":"ContainerStarted","Data":"7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619"} Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.740444 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-74srj_d90f3655-18d4-4dec-b9d4-7309fa424c4e/kube-multus-additional-cni-plugins/0.log" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.740502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" event={"ID":"d90f3655-18d4-4dec-b9d4-7309fa424c4e","Type":"ContainerDied","Data":"35f4c03094d794f3716e26019b15ea8cc5e156f196aebe8617bbf2f86eb1e06d"} Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.740545 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-74srj" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.740542 4693 scope.go:117] "RemoveContainer" containerID="0808cbd9b3184041c9823b389604b3fcddbea0ba558e3fe2991f7655bf49b1d6" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.742460 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67f5b86e-5873-429a-afc0-aaa888476d0e","Type":"ContainerStarted","Data":"2c15d93697396cd049bdf74c7491f330092cada891b76400522c0be682ba01f1"} Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.744534 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2d6g4" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.745206 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mdqwg" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" Dec 04 09:44:58 crc kubenswrapper[4693]: E1204 09:44:58.745671 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-sjp76" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.813050 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-74srj"] Dec 04 09:44:58 crc kubenswrapper[4693]: I1204 09:44:58.818034 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-74srj"] Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.134615 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2"] Dec 04 09:45:00 crc kubenswrapper[4693]: E1204 09:45:00.135103 4693 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.135115 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.135203 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" containerName="kube-multus-additional-cni-plugins" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.135597 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.137402 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.137704 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.149460 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2"] Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.232920 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.233008 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndzd\" (UniqueName: \"kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.233033 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.334221 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.334290 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndzd\" (UniqueName: \"kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.334318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.335662 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.342090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.352363 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndzd\" (UniqueName: \"kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd\") pod \"collect-profiles-29414025-qsdk2\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.465066 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.473632 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90f3655-18d4-4dec-b9d4-7309fa424c4e" path="/var/lib/kubelet/pods/d90f3655-18d4-4dec-b9d4-7309fa424c4e/volumes" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.755079 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a72ca277-b61f-4242-9e81-0bbabbffab52","Type":"ContainerStarted","Data":"21ff572591168b10a6a743cd911e66de2f0a9b6e43a6866910cae55157f41dad"} Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.756680 4693 generic.go:334] "Generic (PLEG): container finished" podID="67f5b86e-5873-429a-afc0-aaa888476d0e" containerID="258e450c7988376583a49fee2412fa33951edda985648dbd57be086f403e653a" exitCode=0 Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.756763 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67f5b86e-5873-429a-afc0-aaa888476d0e","Type":"ContainerDied","Data":"258e450c7988376583a49fee2412fa33951edda985648dbd57be086f403e653a"} Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.778376 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=50.778353454 podStartE2EDuration="50.778353454s" podCreationTimestamp="2025-12-04 09:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:45:00.771719673 +0000 UTC m=+146.669313446" watchObservedRunningTime="2025-12-04 09:45:00.778353454 +0000 UTC m=+146.675947207" Dec 04 09:45:00 crc kubenswrapper[4693]: I1204 09:45:00.851698 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2"] Dec 04 09:45:00 crc kubenswrapper[4693]: W1204 09:45:00.858420 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225b0ea4_5e69_43d9_92b8_e54ff9cef03b.slice/crio-e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a WatchSource:0}: Error finding container e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a: Status 404 returned error can't find the container with id e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a Dec 04 09:45:01 crc kubenswrapper[4693]: I1204 09:45:01.763540 4693 generic.go:334] "Generic (PLEG): container finished" podID="225b0ea4-5e69-43d9-92b8-e54ff9cef03b" containerID="08cabf8ca227bd33145df394354f51c5447c3d07671fbcc8c18c505bcd6da123" exitCode=0 Dec 04 09:45:01 crc kubenswrapper[4693]: I1204 09:45:01.763596 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" event={"ID":"225b0ea4-5e69-43d9-92b8-e54ff9cef03b","Type":"ContainerDied","Data":"08cabf8ca227bd33145df394354f51c5447c3d07671fbcc8c18c505bcd6da123"} Dec 04 09:45:01 crc kubenswrapper[4693]: I1204 09:45:01.763634 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" event={"ID":"225b0ea4-5e69-43d9-92b8-e54ff9cef03b","Type":"ContainerStarted","Data":"e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a"} Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.038499 
4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.157677 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access\") pod \"67f5b86e-5873-429a-afc0-aaa888476d0e\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.157777 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir\") pod \"67f5b86e-5873-429a-afc0-aaa888476d0e\" (UID: \"67f5b86e-5873-429a-afc0-aaa888476d0e\") " Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.157906 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "67f5b86e-5873-429a-afc0-aaa888476d0e" (UID: "67f5b86e-5873-429a-afc0-aaa888476d0e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.158103 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67f5b86e-5873-429a-afc0-aaa888476d0e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.162764 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "67f5b86e-5873-429a-afc0-aaa888476d0e" (UID: "67f5b86e-5873-429a-afc0-aaa888476d0e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.259440 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67f5b86e-5873-429a-afc0-aaa888476d0e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.780875 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67f5b86e-5873-429a-afc0-aaa888476d0e","Type":"ContainerDied","Data":"2c15d93697396cd049bdf74c7491f330092cada891b76400522c0be682ba01f1"} Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.780920 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c15d93697396cd049bdf74c7491f330092cada891b76400522c0be682ba01f1" Dec 04 09:45:02 crc kubenswrapper[4693]: I1204 09:45:02.781010 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.059476 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.176633 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume\") pod \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.176689 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ndzd\" (UniqueName: \"kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd\") pod \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.176761 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume\") pod \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\" (UID: \"225b0ea4-5e69-43d9-92b8-e54ff9cef03b\") " Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.177611 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume" (OuterVolumeSpecName: "config-volume") pod "225b0ea4-5e69-43d9-92b8-e54ff9cef03b" (UID: "225b0ea4-5e69-43d9-92b8-e54ff9cef03b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.182666 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd" (OuterVolumeSpecName: "kube-api-access-2ndzd") pod "225b0ea4-5e69-43d9-92b8-e54ff9cef03b" (UID: "225b0ea4-5e69-43d9-92b8-e54ff9cef03b"). InnerVolumeSpecName "kube-api-access-2ndzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.184494 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "225b0ea4-5e69-43d9-92b8-e54ff9cef03b" (UID: "225b0ea4-5e69-43d9-92b8-e54ff9cef03b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.277887 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.277921 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ndzd\" (UniqueName: \"kubernetes.io/projected/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-kube-api-access-2ndzd\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.277935 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/225b0ea4-5e69-43d9-92b8-e54ff9cef03b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.791282 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" event={"ID":"225b0ea4-5e69-43d9-92b8-e54ff9cef03b","Type":"ContainerDied","Data":"e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a"} Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.791630 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c34de57a6f28cd71a8c4f411dcbe2764ade3c50281af5ae665a66dfef0731a" Dec 04 09:45:03 crc kubenswrapper[4693]: I1204 09:45:03.791323 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2" Dec 04 09:45:22 crc kubenswrapper[4693]: I1204 09:45:22.273549 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:45:22 crc kubenswrapper[4693]: I1204 09:45:22.274293 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.685108 4693 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.686461 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a" gracePeriod=15 Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.686548 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a" gracePeriod=15 Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.686577 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" containerID="cri-o://4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244" gracePeriod=15 Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.686541 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81" gracePeriod=15 Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.686486 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5" gracePeriod=15 Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691015 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691525 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691633 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691656 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691669 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691690 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691703 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691725 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691738 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691753 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225b0ea4-5e69-43d9-92b8-e54ff9cef03b" containerName="collect-profiles" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691765 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="225b0ea4-5e69-43d9-92b8-e54ff9cef03b" containerName="collect-profiles" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691878 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691892 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691914 4693 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691927 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.691983 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f5b86e-5873-429a-afc0-aaa888476d0e" containerName="pruner" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.691999 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f5b86e-5873-429a-afc0-aaa888476d0e" containerName="pruner" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.694195 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.694239 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f5b86e-5873-429a-afc0-aaa888476d0e" containerName="pruner" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.694658 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.694950 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="225b0ea4-5e69-43d9-92b8-e54ff9cef03b" containerName="collect-profiles" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.695046 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.695069 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.695090 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.695106 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:45:37 crc kubenswrapper[4693]: E1204 09:45:37.695767 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.695990 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.702225 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.703250 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.707694 4693 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764101 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764173 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764204 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764255 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764297 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764321 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764360 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.764391 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865503 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865894 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865914 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865967 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865986 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866007 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866103 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.865754 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866155 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866177 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866199 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866219 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866238 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:37 crc kubenswrapper[4693]: I1204 09:45:37.866260 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.022973 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.024159 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.024829 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5" exitCode=0 Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.024934 4693 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a" exitCode=0 Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.025012 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81" exitCode=0 Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.025085 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244" exitCode=2 Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.024967 4693 scope.go:117] "RemoveContainer" containerID="ba70fba21706d5cd46c7264ae364f0e6d05bb6833ffa54262e88734a2f45cf81" Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.026912 4693 generic.go:334] "Generic (PLEG): container finished" podID="a72ca277-b61f-4242-9e81-0bbabbffab52" containerID="21ff572591168b10a6a743cd911e66de2f0a9b6e43a6866910cae55157f41dad" exitCode=0 Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.027001 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a72ca277-b61f-4242-9e81-0bbabbffab52","Type":"ContainerDied","Data":"21ff572591168b10a6a743cd911e66de2f0a9b6e43a6866910cae55157f41dad"} Dec 04 09:45:38 crc kubenswrapper[4693]: I1204 09:45:38.028173 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:38 crc kubenswrapper[4693]: E1204 09:45:38.811740 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-2d6g4.187dfa076da7d528 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-2d6g4,UID:7070356d-e89a-4f1c-a247-051bf520ae02,APIVersion:v1,ResourceVersion:28450,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 6.923s (6.923s including waiting). Image size: 1129027903 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:45:38.81053316 +0000 UTC m=+184.708126913,LastTimestamp:2025-12-04 09:45:38.81053316 +0000 UTC m=+184.708126913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.036083 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.295623 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.296347 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.388808 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access\") pod \"a72ca277-b61f-4242-9e81-0bbabbffab52\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389202 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir\") pod \"a72ca277-b61f-4242-9e81-0bbabbffab52\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389242 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock\") pod \"a72ca277-b61f-4242-9e81-0bbabbffab52\" (UID: \"a72ca277-b61f-4242-9e81-0bbabbffab52\") " Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389343 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a72ca277-b61f-4242-9e81-0bbabbffab52" (UID: "a72ca277-b61f-4242-9e81-0bbabbffab52"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389404 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock" (OuterVolumeSpecName: "var-lock") pod "a72ca277-b61f-4242-9e81-0bbabbffab52" (UID: "a72ca277-b61f-4242-9e81-0bbabbffab52"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389634 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.389654 4693 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a72ca277-b61f-4242-9e81-0bbabbffab52-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.394811 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a72ca277-b61f-4242-9e81-0bbabbffab52" (UID: "a72ca277-b61f-4242-9e81-0bbabbffab52"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:39 crc kubenswrapper[4693]: I1204 09:45:39.490567 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a72ca277-b61f-4242-9e81-0bbabbffab52-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.041674 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.046104 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.046648 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.046940 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.047762 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerStarted","Data":"8a58ec9a1f30ead081ddd42026fe0495974cc6b81d86016b1e139a0e6c80bc40"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.048615 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.048775 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.048916 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.050430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerStarted","Data":"e139e343accd1785464899b87f152c13d96a827cda91a862accaad2209d96dc5"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.050984 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.051296 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.051553 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.051909 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.051911 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerStarted","Data":"4a0f8998eea98d262bb44f1c07beb6ff91d86e8d3af2100e0716445466d0b046"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.052363 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.052662 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.052855 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.053071 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.053309 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" 
event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerStarted","Data":"951b458475e15d0a1b8df4e664c6df7a558952840f7f6cce525851b2827dccf6"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.053645 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.054296 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.054654 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.054886 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.055190 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.055459 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a72ca277-b61f-4242-9e81-0bbabbffab52","Type":"ContainerDied","Data":"7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.055562 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a005066918c6819bc65b4805f02943f7c76fddeb07d51f8016414585fc85619" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.055478 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.055813 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.056024 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.059145 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerStarted","Data":"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.060016 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.060314 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.060519 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.060725 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerStarted","Data":"edae1bd887364324f804eee6833761767c24a8164e059114e74a572b436a2e53"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.060834 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.061179 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.061643 4693 status_manager.go:851] 
"Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.061891 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.062410 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.063440 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.063590 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.063780 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.064127 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.064600 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.064799 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a" exitCode=0 Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.064929 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.065051 4693 scope.go:117] "RemoveContainer" containerID="5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.064917 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.065438 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.065751 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.081723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerStarted","Data":"a2ea22eec436822d85593c735a68b6b48c73e91f258b9288e8290a67e3ac159e"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.082548 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.082797 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.083208 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.083557 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.083827 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" 
pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.084049 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerStarted","Data":"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609"} Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.084195 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.084474 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.085453 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.085830 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.086129 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.086448 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.086878 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.087763 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.088073 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.088421 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.088725 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.089537 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.089918 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.090201 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.097127 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.097183 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.097206 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:45:40 crc 
kubenswrapper[4693]: I1204 09:45:40.097268 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.097321 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.097363 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.098646 4693 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.098675 4693 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.098685 4693 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.155424 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.156045 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.156448 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.156801 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: 
connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.157095 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.157385 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.157764 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.158144 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.158441 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.158780 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.165567 4693 scope.go:117] "RemoveContainer" containerID="e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.183926 4693 scope.go:117] "RemoveContainer" containerID="04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.196989 4693 scope.go:117] "RemoveContainer" containerID="4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.452408 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.452867 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.453171 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.453489 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.453771 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.454057 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.454404 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.454686 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.454984 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.455258 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.459627 4693 scope.go:117] "RemoveContainer" containerID="68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 
09:45:40.472360 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.485041 4693 scope.go:117] "RemoveContainer" containerID="eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.643579 4693 scope.go:117] "RemoveContainer" containerID="5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.644515 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5\": container with ID starting with 5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5 not found: ID does not exist" containerID="5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.644585 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5"} err="failed to get container status \"5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5\": rpc error: code = NotFound desc = could not find container \"5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5\": container with ID starting with 5830ae0efa54f837c13ae098b4dac7e3c73250f55e2641e916dbb30c89c829b5 not found: ID does not exist" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.644673 4693 scope.go:117] "RemoveContainer" containerID="e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.645272 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\": container with ID starting with e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a not found: ID does not exist" containerID="e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.645315 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a"} err="failed to get container status \"e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\": rpc error: code = NotFound desc = could not find container \"e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a\": container with ID starting with e74641d799edb6a4f5e88804ac71bed808d23cfbacfc7a827a8ac3b3e35b910a not found: ID does not exist" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.645365 4693 scope.go:117] "RemoveContainer" containerID="04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.646002 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\": container with ID starting with 04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81 not found: ID does not exist" containerID="04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.646108 
4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81"} err="failed to get container status \"04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\": rpc error: code = NotFound desc = could not find container \"04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81\": container with ID starting with 04bdd206f1b3b49e2e20493e6447668c3de1675e8a8264a5d44cdda619862b81 not found: ID does not exist" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.646137 4693 scope.go:117] "RemoveContainer" containerID="4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.648799 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\": container with ID starting with 4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244 not found: ID does not exist" containerID="4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.648879 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244"} err="failed to get container status \"4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\": rpc error: code = NotFound desc = could not find container \"4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244\": container with ID starting with 4fc5a089f9adfcecadb1c36dfb0e79952b135122b997f3d2caa36ff14c82d244 not found: ID does not exist" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.648928 4693 scope.go:117] "RemoveContainer" containerID="68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.649683 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\": container with ID starting with 68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a not found: ID does not exist" containerID="68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.649764 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a"} err="failed to get container status \"68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\": rpc error: code = NotFound desc = could not find container \"68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a\": container with ID starting with 68f8f9fc1cae80cd0ad4be20d8efc260c5e201546591544d8bc7db5af6e9f48a not found: ID does not exist" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.649808 4693 scope.go:117] "RemoveContainer" containerID="eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993" Dec 04 09:45:40 crc kubenswrapper[4693]: E1204 09:45:40.650467 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\": container with ID starting with eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993 not found: ID 
does not exist" containerID="eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993" Dec 04 09:45:40 crc kubenswrapper[4693]: I1204 09:45:40.650521 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993"} err="failed to get container status \"eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\": rpc error: code = NotFound desc = could not find container \"eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993\": container with ID starting with eaa14047e470bf84a2d7b854ec045f2566969221b63563d09e82892e93d1a993 not found: ID does not exist" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.091012 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac8a0ee1-b340-421c-8496-74757e180a20" containerID="e139e343accd1785464899b87f152c13d96a827cda91a862accaad2209d96dc5" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.091099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerDied","Data":"e139e343accd1785464899b87f152c13d96a827cda91a862accaad2209d96dc5"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.091895 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.092249 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.093116 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.093394 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.093563 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.093786 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.094049 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.094238 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.094455 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.095097 4693 generic.go:334] "Generic (PLEG): container finished" podID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerID="a2ea22eec436822d85593c735a68b6b48c73e91f258b9288e8290a67e3ac159e" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.095143 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerDied","Data":"a2ea22eec436822d85593c735a68b6b48c73e91f258b9288e8290a67e3ac159e"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.095722 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.095953 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.096165 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.096399 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 
09:45:41.096675 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.096940 4693 generic.go:334] "Generic (PLEG): container finished" podID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerID="5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.096989 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.097021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerDied","Data":"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.097359 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.097638 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.097927 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.098363 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.098669 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.098952 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.099269 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.099819 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.100001 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.100204 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.100461 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.100760 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.107710 4693 generic.go:334] "Generic (PLEG): container finished" podID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerID="edae1bd887364324f804eee6833761767c24a8164e059114e74a572b436a2e53" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.107792 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerDied","Data":"edae1bd887364324f804eee6833761767c24a8164e059114e74a572b436a2e53"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.108971 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 
09:45:41.109156 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.109389 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.110078 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.110238 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.110848 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.111323 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.111628 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.111937 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.117107 4693 generic.go:334] "Generic (PLEG): container finished" podID="7070356d-e89a-4f1c-a247-051bf520ae02" containerID="4a0f8998eea98d262bb44f1c07beb6ff91d86e8d3af2100e0716445466d0b046" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.117190 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" 
event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerDied","Data":"4a0f8998eea98d262bb44f1c07beb6ff91d86e8d3af2100e0716445466d0b046"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.118371 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.118597 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.119370 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.120483 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.120710 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.121061 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.121234 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.121408 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.121627 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.124026 4693 generic.go:334] "Generic (PLEG): container finished" podID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerID="951b458475e15d0a1b8df4e664c6df7a558952840f7f6cce525851b2827dccf6" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.124049 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerDied","Data":"951b458475e15d0a1b8df4e664c6df7a558952840f7f6cce525851b2827dccf6"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.125005 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.125149 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.125370 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.125591 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.125829 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.126090 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.126272 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 
09:45:41.126554 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.126804 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.127977 4693 generic.go:334] "Generic (PLEG): container finished" podID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerID="82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.128055 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerDied","Data":"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.128092 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerStarted","Data":"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.128574 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.128840 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.129189 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.129479 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.129720 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.130136 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.130530 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.130799 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.131026 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.133127 4693 generic.go:334] "Generic (PLEG): container finished" podID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerID="8a58ec9a1f30ead081ddd42026fe0495974cc6b81d86016b1e139a0e6c80bc40" exitCode=0 Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.133176 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerDied","Data":"8a58ec9a1f30ead081ddd42026fe0495974cc6b81d86016b1e139a0e6c80bc40"} Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.133547 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.133753 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.134031 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 
09:45:41.134383 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.134574 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.134890 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.135198 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.136147 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:41 crc kubenswrapper[4693]: I1204 09:45:41.136393 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.149652 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerStarted","Data":"222f992a57089b16283cd25822fb7fda97ee625ca48ce0858a85b2516afb98f1"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.151064 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.151237 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.151406 4693 status_manager.go:851] "Failed to get status for pod" 
podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.151592 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.151903 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.152356 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.152636 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.152859 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.153128 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.153912 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerStarted","Data":"671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.154832 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.155002 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" 
pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.155206 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.155675 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.155897 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.156077 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.156239 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.156469 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.156735 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.157498 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerStarted","Data":"9635674fc4bf52b71ddfa0cf7ed55d8004b61d30106030e9db26618f8e381f86"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.158065 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.158242 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.158449 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.158668 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.158841 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.159102 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.159271 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.159512 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.159756 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.160405 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" 
event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerStarted","Data":"ad9ccae979860da0683251b0407431fb69c858888d1b76eaf3e620a02ea26e16"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.160822 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.161053 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.163121 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.163442 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.163759 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.164069 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.164364 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.164628 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.164808 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.165621 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerStarted","Data":"18de67ff61d1c233beed5d267b7a8bb8263a342eafc1750e3f3427561c5d0ea0"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.166183 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.166411 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.166554 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.166764 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.167040 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.167206 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.167430 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.167672 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.167937 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.168321 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerStarted","Data":"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.168978 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.169124 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.169269 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.169426 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.169614 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.169803 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.170047 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.170278 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.170475 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.171450 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerStarted","Data":"a71348ae53add427709b4a8def892e7afc554f50c8b53fe8790527ac68cae297"} Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.171985 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.172136 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.172363 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.172732 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.172899 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.173045 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.187674 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.207229 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.227745 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:42 crc kubenswrapper[4693]: E1204 09:45:42.721773 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:42 crc kubenswrapper[4693]: I1204 09:45:42.722182 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.187571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"838d029d10876585be32f081854df97ff2cc484d18c24834321f465c3362a503"} Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.188076 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7837727dc09901aaecf550235cab9d32dcd9562106cf3ea867aec6cc409bc02d"} Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.190371 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: E1204 09:45:43.190422 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.191145 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.191478 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.191690 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.191864 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.192025 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.192193 4693 status_manager.go:851] 
"Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.192374 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:43 crc kubenswrapper[4693]: I1204 09:45:43.192540 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.464412 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.465022 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.465420 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.465790 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.466056 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.466346 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.466578 4693 status_manager.go:851] 
"Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.466798 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:44 crc kubenswrapper[4693]: I1204 09:45:44.467027 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.134300 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:45Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:45Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:45Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:45Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[],\\\"sizeBytes\\\":1205106509},{\\\"names\\\":[],\\\"sizeBytes\\\":1201434959},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io
/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.134874 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.135315 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.135630 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.135988 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:45 crc kubenswrapper[4693]: E1204 09:45:45.136034 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:45:46 crc kubenswrapper[4693]: E1204 09:45:46.999843 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.000051 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.000266 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.000473 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.000659 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 
09:45:47.000681 4693 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.000857 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.201450 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.433807 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-2d6g4.187dfa076da7d528 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-2d6g4,UID:7070356d-e89a-4f1c-a247-051bf520ae02,APIVersion:v1,ResourceVersion:28450,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 6.923s (6.923s including waiting). Image size: 1129027903 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:45:38.81053316 +0000 UTC m=+184.708126913,LastTimestamp:2025-12-04 09:45:38.81053316 +0000 UTC m=+184.708126913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.477723 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.477778 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.539636 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.540349 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.540665 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.541080 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.541401 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.541655 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.541935 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.542210 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.542560 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.542788 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: E1204 09:45:47.602566 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.693319 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.693663 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.736373 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.736834 4693 status_manager.go:851] "Failed to 
get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.737015 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.737308 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.738148 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.738361 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.738548 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.738753 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.738945 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.739155 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.893415 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.893477 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.937254 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.937717 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.938034 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.938364 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.938689 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.939138 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.939496 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.939743 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.940163 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:47 crc kubenswrapper[4693]: I1204 09:45:47.940473 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.263037 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.263800 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.264254 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.264742 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.265158 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.266240 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.266840 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.267115 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc 
kubenswrapper[4693]: I1204 09:45:48.267163 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.267498 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.267936 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.268534 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.268863 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.269166 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.269666 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.269973 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.271519 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.271776 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.272108 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.272473 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.282407 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.282912 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.283278 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.283488 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.283722 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.284028 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.284325 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc 
kubenswrapper[4693]: I1204 09:45:48.284724 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.284977 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.285241 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: E1204 09:45:48.403719 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.605254 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.605320 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.652637 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.653401 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.653773 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.654193 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.654758 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.655164 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.655751 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.656235 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.656697 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:48 crc kubenswrapper[4693]: I1204 09:45:48.657191 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.261638 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.262251 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.262668 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.262938 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 
09:45:49.263192 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.263460 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.263725 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.263963 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.264215 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.264497 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.903706 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.903754 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.943257 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.943985 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.944347 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.944780 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.945146 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.945441 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.945796 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.946262 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.946755 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4693]: I1204 09:45:49.947100 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: E1204 09:45:50.005188 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.255786 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.255862 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.292511 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.293283 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.293847 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.294306 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.294646 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.295222 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.295822 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.296197 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.296771 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.296825 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.297197 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.297670 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.298067 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.298508 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.298865 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.299220 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.299645 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.299948 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.300299 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.300792 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.460605 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.461560 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.462281 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.463116 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.463553 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.464248 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.464640 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.465060 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc 
kubenswrapper[4693]: I1204 09:45:50.465410 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.465864 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.486575 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.486611 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:45:50 crc kubenswrapper[4693]: E1204 09:45:50.487194 4693 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.487954 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.653693 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.653747 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.697301 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.698110 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.698452 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.698880 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.699131 4693 
status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.699497 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.700119 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.700391 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.701043 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.701379 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.848200 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.848265 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.894609 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.895738 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.896219 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.896801 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.897256 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.897676 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.897972 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.898226 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.898582 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:50 crc kubenswrapper[4693]: I1204 09:45:50.899051 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.236898 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4d4a3daf6b48384f09cc635ade40a65a3f8d105f31caf966d6dad1cc0998efd7"} Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.272409 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.272850 4693 status_manager.go:851] "Failed to get status for pod" 
podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.273023 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.273485 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.274088 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.274201 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.274666 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.275242 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.275472 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.275637 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.275783 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.276031 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.276221 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.276445 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.276709 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.276871 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.277032 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.277190 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.277361 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.277574 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.284720 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285045 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285208 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285398 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285616 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285772 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.285918 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.286057 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc kubenswrapper[4693]: I1204 09:45:51.286197 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:51 crc 
kubenswrapper[4693]: I1204 09:45:51.286506 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:52 crc kubenswrapper[4693]: I1204 09:45:52.272765 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:45:52 crc kubenswrapper[4693]: I1204 09:45:52.272837 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: E1204 09:45:53.206534 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="6.4s" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.248159 4693 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9d37755e639d5a67ec91619b24bb51ca02d51bebb2073a68f0579e376ef91b4d" exitCode=0 Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.248242 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9d37755e639d5a67ec91619b24bb51ca02d51bebb2073a68f0579e376ef91b4d"} Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.248502 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.248533 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.248861 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: E1204 09:45:53.248947 4693 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.249083 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection 
refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.249276 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.249564 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.249935 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.250145 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.250362 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.250562 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.250749 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251020 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251072 4693 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bf86e2bd0e3a8a381534887778c1c978c95f7075a924928163e4c11ec30ee13d" exitCode=1 Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251100 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bf86e2bd0e3a8a381534887778c1c978c95f7075a924928163e4c11ec30ee13d"} Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251542 4693 scope.go:117] "RemoveContainer" containerID="bf86e2bd0e3a8a381534887778c1c978c95f7075a924928163e4c11ec30ee13d" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251685 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.251943 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.252211 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.252456 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.252658 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.252890 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.253159 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.253408 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.253723 4693 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:53 crc kubenswrapper[4693]: I1204 09:45:53.254396 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.259411 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.259720 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d86f6d48b05fd78dd1193b3bcc27e18bd9e7510817dbc59be18e2ec9f0c3c1d7"} Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.466821 4693 status_manager.go:851] "Failed to get status for pod" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" pod="openshift-marketplace/redhat-marketplace-2d6g4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2d6g4\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.467353 4693 status_manager.go:851] "Failed to get status for pod" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" pod="openshift-marketplace/community-operators-l7q9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l7q9x\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.467545 4693 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.467713 4693 status_manager.go:851] "Failed to get status for pod" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" pod="openshift-marketplace/certified-operators-nnfsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-nnfsl\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.467975 4693 status_manager.go:851] "Failed to get status for pod" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" pod="openshift-marketplace/redhat-operators-sjp76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sjp76\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.468243 4693 status_manager.go:851] "Failed to get status for pod" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" 
pod="openshift-marketplace/redhat-marketplace-mdqwg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mdqwg\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.468581 4693 status_manager.go:851] "Failed to get status for pod" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.468778 4693 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.468989 4693 status_manager.go:851] "Failed to get status for pod" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" pod="openshift-marketplace/redhat-operators-wgqzj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wgqzj\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.469226 4693 status_manager.go:851] "Failed to get status for pod" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" pod="openshift-marketplace/community-operators-c7zjm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-c7zjm\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4693]: I1204 09:45:54.469441 4693 status_manager.go:851] "Failed to get status for pod" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" pod="openshift-marketplace/certified-operators-5rl2n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5rl2n\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.251892 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:45:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[],\\\"sizeBytes\\\":1205106509},{\\\"names\\\":[],\\\"sizeBytes\\\":1201434959},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a218
0b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.252457 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.252760 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" 
Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.253200 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.253790 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" Dec 04 09:45:55 crc kubenswrapper[4693]: E1204 09:45:55.253817 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:45:55 crc kubenswrapper[4693]: I1204 09:45:55.265663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a2d58a7d6d7b4cc34726a90c4ffca64a99f04e4cf95aef30a57c16a0f288dfbb"} Dec 04 09:45:57 crc kubenswrapper[4693]: I1204 09:45:57.205844 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:45:57 crc kubenswrapper[4693]: I1204 09:45:57.210107 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:45:57 crc kubenswrapper[4693]: I1204 09:45:57.281661 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:45:58 crc kubenswrapper[4693]: I1204 09:45:58.289350 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"946c603e3ebc3f101f97ed33cfd52d9ace676dcbd9d8a39e6ce0c1dc260a4dbd"} Dec 04 09:46:04 crc kubenswrapper[4693]: I1204 09:46:04.333295 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bcad64eafceb473a51008b33312020c87c02b6eb877dececc73a7827f9c7c517"} Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.349001 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"00b82b1bf033fa3906b9370dafc88758a8d4fd313f302d294ac40d4741a5249f"} Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.349274 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb1d50b5d90d5c58c252a80796c07d2e6b7641d3663c2ccd54b40519ebeb6997"} Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.349375 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.349404 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.349459 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.357137 4693 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:06 crc kubenswrapper[4693]: I1204 09:46:06.366802 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd9459fc-2791-47a8-8de4-2ed953140a1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2d58a7d6d7b4cc34726a90c4ffca64a99f04e4cf95aef30a57c16a0f288dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad64eafceb473a51008b33312020c87c02b6eb877dececc73a7827f9c7c517\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://946c603e3ebc3f101f97ed33cfd52d9ace676dcbd9d8a39e6ce0c1dc260a4dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b82b1bf033fa3906b9370dafc88758a8d4fd313f302d294ac40d4741a5249f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1d50b5d90d5c58c252a80796c07d2e6b7641d3663c2ccd54b40519ebeb6997\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:46:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Dec 04 09:46:07 crc kubenswrapper[4693]: I1204 09:46:07.357592 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:07 crc kubenswrapper[4693]: I1204 09:46:07.357897 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:09 crc kubenswrapper[4693]: I1204 09:46:09.085125 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:46:09 crc kubenswrapper[4693]: I1204 09:46:09.108579 4693 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ac79fd71-a9bd-49e1-b82b-8bdc363eb88a" Dec 04 09:46:10 crc kubenswrapper[4693]: I1204 09:46:10.961587 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 09:46:11 crc kubenswrapper[4693]: I1204 09:46:11.576999 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 09:46:11 crc kubenswrapper[4693]: I1204 09:46:11.682451 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 09:46:11 crc kubenswrapper[4693]: I1204 09:46:11.685039 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 09:46:11 crc kubenswrapper[4693]: I1204 09:46:11.744185 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 09:46:11 crc 
kubenswrapper[4693]: I1204 09:46:11.871614 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 09:46:11 crc kubenswrapper[4693]: I1204 09:46:11.966628 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.097512 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.338901 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.473442 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.582709 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.598415 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.892231 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.918955 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 09:46:12 crc kubenswrapper[4693]: I1204 09:46:12.942005 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.100734 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.160394 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.287019 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.301676 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.305269 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.446240 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.548506 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.549534 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.657056 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 
09:46:13 crc kubenswrapper[4693]: I1204 09:46:13.977535 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.429783 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.483506 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.497249 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.561404 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.576215 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.694705 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.741526 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.803178 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.828180 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.828180 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 09:46:14 crc kubenswrapper[4693]: I1204 09:46:14.837576 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.000355 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.023714 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.090681 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.168422 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.349759 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.433610 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.819747 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.835886 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.835968 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 09:46:15 crc kubenswrapper[4693]: I1204 09:46:15.867889 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.005671 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.017739 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.040878 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.133275 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.141126 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.172607 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.212237 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.218888 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.242571 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.242571 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.362453 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.402428 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.534561 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.547516 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.564088 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.675422 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.681562 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.688644 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.743364 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.748132 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.793905 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.803030 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.820571 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.979302 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 09:46:16 crc kubenswrapper[4693]: I1204 09:46:16.994427 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.059839 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.070694 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.124953 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.129951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.145439 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.174428 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.295323 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.354895 4693 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.377183 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.471345 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 09:46:17 
crc kubenswrapper[4693]: I1204 09:46:17.570819 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.598951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.676481 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.677525 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.765025 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.778589 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.803647 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.886868 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.958044 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 09:46:17 crc kubenswrapper[4693]: I1204 09:46:17.993170 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.064323 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.086929 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.170780 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.209713 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.243996 4693 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.249139 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.386479 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.401904 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.531080 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.566634 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.600374 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.672426 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.715296 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.738734 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.817242 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 09:46:18 crc kubenswrapper[4693]: I1204 09:46:18.954948 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.113652 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.198323 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.207049 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.214281 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.225387 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.268221 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.273326 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.333414 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.407976 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.449780 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.455318 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 09:46:19 crc kubenswrapper[4693]: 
I1204 09:46:19.585453 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.635794 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.717202 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.753152 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.781555 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 09:46:19 crc kubenswrapper[4693]: I1204 09:46:19.912751 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.068313 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.069864 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.084596 4693 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.098724 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.192071 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.339578 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.353932 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.409454 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.426930 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.473438 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.674217 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.780788 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.799901 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.800254 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.869058 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.876284 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.898739 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 09:46:20 crc kubenswrapper[4693]: I1204 09:46:20.961741 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.001081 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.237808 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.269714 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.352006 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.501106 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.563379 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.760567 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.822970 4693 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.849167 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.878104 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 09:46:21 crc kubenswrapper[4693]: I1204 09:46:21.927230 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.008485 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.027898 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.129146 4693 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.158168 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.257998 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.272589 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.272646 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.272693 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.273220 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.273276 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c" gracePeriod=600 Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.316531 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.380639 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.399862 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.480389 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.486602 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.522972 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.596058 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:46:22 crc 
kubenswrapper[4693]: I1204 09:46:22.596118 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.597928 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.757053 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 09:46:22 crc kubenswrapper[4693]: I1204 09:46:22.922849 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.023922 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.071457 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.085847 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.085850 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.177037 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.190012 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.204954 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.284263 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.372239 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.413258 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.418787 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.438919 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.482040 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.507469 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.554202 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.569203 4693 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.657244 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.660345 4693 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.694543 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.787046 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.830009 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.947166 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 09:46:23 crc kubenswrapper[4693]: I1204 09:46:23.947166 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.011419 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.109814 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.110729 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.128166 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.238794 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.306866 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.349217 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.387180 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.539749 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.543176 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.557143 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.595467 4693 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.626264 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.689070 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.719618 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.973464 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 09:46:24 crc kubenswrapper[4693]: I1204 09:46:24.974542 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 09:46:25 crc kubenswrapper[4693]: I1204 09:46:25.256436 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:46:25 crc kubenswrapper[4693]: I1204 09:46:25.430635 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 09:46:25 crc kubenswrapper[4693]: I1204 09:46:25.838736 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 09:46:25 crc kubenswrapper[4693]: I1204 09:46:25.937294 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.300950 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.333982 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.387481 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.407550 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.456526 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.465147 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c" exitCode=0 Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.466747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c"} Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.491024 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.517016 4693 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.562073 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.603378 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.952509 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 09:46:26 crc kubenswrapper[4693]: I1204 09:46:26.963418 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 09:46:27 crc kubenswrapper[4693]: I1204 09:46:27.072895 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 09:46:27 crc kubenswrapper[4693]: I1204 09:46:27.864976 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 09:46:28 crc kubenswrapper[4693]: I1204 09:46:28.476149 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d"} Dec 04 09:46:38 crc kubenswrapper[4693]: I1204 09:46:38.160075 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 09:46:39 crc kubenswrapper[4693]: I1204 09:46:39.441046 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 09:46:40 crc kubenswrapper[4693]: I1204 09:46:40.769774 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.186546 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.982440 4693 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.982626 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c7zjm" podStartSLOduration=68.091448148 podStartE2EDuration="3m14.982582425s" podCreationTimestamp="2025-12-04 09:43:27 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.695262576 +0000 UTC m=+60.592856329" lastFinishedPulling="2025-12-04 09:45:41.586396853 +0000 UTC m=+187.483990606" observedRunningTime="2025-12-04 09:46:01.615022983 +0000 UTC m=+207.512616736" watchObservedRunningTime="2025-12-04 09:46:41.982582425 +0000 UTC m=+247.880176168" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.982811 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sjp76" podStartSLOduration=65.115596898 podStartE2EDuration="3m11.982804521s" podCreationTimestamp="2025-12-04 09:43:30 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.672605557 +0000 UTC m=+60.570199310" lastFinishedPulling="2025-12-04 09:45:41.53981318 +0000 UTC m=+187.437406933" 
observedRunningTime="2025-12-04 09:46:01.558376442 +0000 UTC m=+207.455970195" watchObservedRunningTime="2025-12-04 09:46:41.982804521 +0000 UTC m=+247.880398274" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.984124 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7q9x" podStartSLOduration=68.048385213 podStartE2EDuration="3m14.984115309s" podCreationTimestamp="2025-12-04 09:43:27 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.697431295 +0000 UTC m=+60.595025048" lastFinishedPulling="2025-12-04 09:45:41.633161391 +0000 UTC m=+187.530755144" observedRunningTime="2025-12-04 09:46:01.517061827 +0000 UTC m=+207.414655580" watchObservedRunningTime="2025-12-04 09:46:41.984115309 +0000 UTC m=+247.881709062" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.984206 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdqwg" podStartSLOduration=67.129945263 podStartE2EDuration="3m12.984202241s" podCreationTimestamp="2025-12-04 09:43:29 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.666744839 +0000 UTC m=+60.564338602" lastFinishedPulling="2025-12-04 09:45:40.521001837 +0000 UTC m=+186.418595580" observedRunningTime="2025-12-04 09:46:01.571586925 +0000 UTC m=+207.469180678" watchObservedRunningTime="2025-12-04 09:46:41.984202241 +0000 UTC m=+247.881795984" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.984274 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnfsl" podStartSLOduration=66.686557176 podStartE2EDuration="3m13.984272303s" podCreationTimestamp="2025-12-04 09:43:28 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.701205446 +0000 UTC m=+60.598799199" lastFinishedPulling="2025-12-04 09:45:41.998920573 +0000 UTC m=+187.896514326" observedRunningTime="2025-12-04 09:46:01.545233382 +0000 UTC m=+207.442827135" watchObservedRunningTime="2025-12-04 09:46:41.984272303 +0000 UTC m=+247.881866056" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.985571 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5rl2n" podStartSLOduration=68.117272342 podStartE2EDuration="3m14.985565731s" podCreationTimestamp="2025-12-04 09:43:27 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.700784975 +0000 UTC m=+60.598378728" lastFinishedPulling="2025-12-04 09:45:41.569078364 +0000 UTC m=+187.466672117" observedRunningTime="2025-12-04 09:46:01.486347738 +0000 UTC m=+207.383941491" watchObservedRunningTime="2025-12-04 09:46:41.985565731 +0000 UTC m=+247.883159484" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.986094 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2d6g4" podStartSLOduration=66.167593029 podStartE2EDuration="3m12.986090715s" podCreationTimestamp="2025-12-04 09:43:29 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.691463834 +0000 UTC m=+60.589057587" lastFinishedPulling="2025-12-04 09:45:41.50996152 +0000 UTC m=+187.407555273" observedRunningTime="2025-12-04 09:46:01.500208219 +0000 UTC m=+207.397801972" watchObservedRunningTime="2025-12-04 09:46:41.986090715 +0000 UTC m=+247.883684469" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.986934 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgqzj" podStartSLOduration=65.201102408 
podStartE2EDuration="3m11.98692755s" podCreationTimestamp="2025-12-04 09:43:30 +0000 UTC" firstStartedPulling="2025-12-04 09:43:34.701298709 +0000 UTC m=+60.598892462" lastFinishedPulling="2025-12-04 09:45:41.487123851 +0000 UTC m=+187.384717604" observedRunningTime="2025-12-04 09:46:01.627865994 +0000 UTC m=+207.525459747" watchObservedRunningTime="2025-12-04 09:46:41.98692755 +0000 UTC m=+247.884521313" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.987450 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.987570 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.987888 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.987922 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cd9459fc-2791-47a8-8de4-2ed953140a1a" Dec 04 09:46:41 crc kubenswrapper[4693]: I1204 09:46:41.991993 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:42 crc kubenswrapper[4693]: I1204 09:46:42.009689 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=36.009671238 podStartE2EDuration="36.009671238s" podCreationTimestamp="2025-12-04 09:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:46:42.006780475 +0000 UTC m=+247.904374228" watchObservedRunningTime="2025-12-04 09:46:42.009671238 +0000 UTC m=+247.907264991" Dec 04 09:46:42 crc kubenswrapper[4693]: I1204 09:46:42.167455 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 09:46:42 crc kubenswrapper[4693]: I1204 09:46:42.373139 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 09:46:43 crc kubenswrapper[4693]: I1204 09:46:43.247642 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 09:46:43 crc kubenswrapper[4693]: I1204 09:46:43.450514 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.222567 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.223587 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" podUID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" containerName="route-controller-manager" containerID="cri-o://69183b0b7e2df556a610b3cf896e4abfe7e8bfed4dcbf458912594c4f9fdb1c1" gracePeriod=30 Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.227294 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:46:45 crc kubenswrapper[4693]: 
I1204 09:46:45.227562 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" containerID="cri-o://138202350c7d907279449314b3ae20cf646daebd3b4d027e6c73a0dc2788a760" gracePeriod=30 Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.307687 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.488463 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.488597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.493306 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.566268 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:46:45 crc kubenswrapper[4693]: I1204 09:46:45.934305 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.569293 4693 generic.go:334] "Generic (PLEG): container finished" podID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" containerID="69183b0b7e2df556a610b3cf896e4abfe7e8bfed4dcbf458912594c4f9fdb1c1" exitCode=0 Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.569366 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" event={"ID":"68219a06-b58a-4d36-b851-32dd1e4a2ec5","Type":"ContainerDied","Data":"69183b0b7e2df556a610b3cf896e4abfe7e8bfed4dcbf458912594c4f9fdb1c1"} Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.573124 4693 generic.go:334] "Generic (PLEG): container finished" podID="390724a0-ca5c-4309-93a5-13aa44b32831" containerID="138202350c7d907279449314b3ae20cf646daebd3b4d027e6c73a0dc2788a760" exitCode=0 Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.573250 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" event={"ID":"390724a0-ca5c-4309-93a5-13aa44b32831","Type":"ContainerDied","Data":"138202350c7d907279449314b3ae20cf646daebd3b4d027e6c73a0dc2788a760"} Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.846932 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.853052 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947643 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert\") pod \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947703 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc67k\" (UniqueName: \"kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k\") pod \"390724a0-ca5c-4309-93a5-13aa44b32831\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947733 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles\") pod \"390724a0-ca5c-4309-93a5-13aa44b32831\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947756 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert\") pod \"390724a0-ca5c-4309-93a5-13aa44b32831\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947773 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config\") pod \"390724a0-ca5c-4309-93a5-13aa44b32831\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947806 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca\") pod \"390724a0-ca5c-4309-93a5-13aa44b32831\" (UID: \"390724a0-ca5c-4309-93a5-13aa44b32831\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947852 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca\") pod \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947873 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config\") pod \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.947893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp6xh\" (UniqueName: \"kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh\") pod \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\" (UID: \"68219a06-b58a-4d36-b851-32dd1e4a2ec5\") " Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.948489 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca" (OuterVolumeSpecName: "client-ca") pod "68219a06-b58a-4d36-b851-32dd1e4a2ec5" 
(UID: "68219a06-b58a-4d36-b851-32dd1e4a2ec5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.948507 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config" (OuterVolumeSpecName: "config") pod "68219a06-b58a-4d36-b851-32dd1e4a2ec5" (UID: "68219a06-b58a-4d36-b851-32dd1e4a2ec5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.948997 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "390724a0-ca5c-4309-93a5-13aa44b32831" (UID: "390724a0-ca5c-4309-93a5-13aa44b32831"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.949144 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca" (OuterVolumeSpecName: "client-ca") pod "390724a0-ca5c-4309-93a5-13aa44b32831" (UID: "390724a0-ca5c-4309-93a5-13aa44b32831"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.949371 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config" (OuterVolumeSpecName: "config") pod "390724a0-ca5c-4309-93a5-13aa44b32831" (UID: "390724a0-ca5c-4309-93a5-13aa44b32831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.954724 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68219a06-b58a-4d36-b851-32dd1e4a2ec5" (UID: "68219a06-b58a-4d36-b851-32dd1e4a2ec5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.954738 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "390724a0-ca5c-4309-93a5-13aa44b32831" (UID: "390724a0-ca5c-4309-93a5-13aa44b32831"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.955261 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k" (OuterVolumeSpecName: "kube-api-access-gc67k") pod "390724a0-ca5c-4309-93a5-13aa44b32831" (UID: "390724a0-ca5c-4309-93a5-13aa44b32831"). InnerVolumeSpecName "kube-api-access-gc67k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:46:46 crc kubenswrapper[4693]: I1204 09:46:46.955680 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh" (OuterVolumeSpecName: "kube-api-access-dp6xh") pod "68219a06-b58a-4d36-b851-32dd1e4a2ec5" (UID: "68219a06-b58a-4d36-b851-32dd1e4a2ec5"). 
InnerVolumeSpecName "kube-api-access-dp6xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049045 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049083 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68219a06-b58a-4d36-b851-32dd1e4a2ec5-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049092 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp6xh\" (UniqueName: \"kubernetes.io/projected/68219a06-b58a-4d36-b851-32dd1e4a2ec5-kube-api-access-dp6xh\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049102 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68219a06-b58a-4d36-b851-32dd1e4a2ec5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049110 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc67k\" (UniqueName: \"kubernetes.io/projected/390724a0-ca5c-4309-93a5-13aa44b32831-kube-api-access-gc67k\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049118 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049127 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049134 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390724a0-ca5c-4309-93a5-13aa44b32831-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.049143 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/390724a0-ca5c-4309-93a5-13aa44b32831-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.535876 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:46:47 crc kubenswrapper[4693]: E1204 09:46:47.536107 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" containerName="installer" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536121 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" containerName="installer" Dec 04 09:46:47 crc kubenswrapper[4693]: E1204 09:46:47.536145 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" containerName="route-controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536151 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" containerName="route-controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: E1204 09:46:47.536161 4693 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536167 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536256 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72ca277-b61f-4242-9e81-0bbabbffab52" containerName="installer" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536268 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" containerName="controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536279 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" containerName="route-controller-manager" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.536652 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.548795 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.588455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" event={"ID":"68219a06-b58a-4d36-b851-32dd1e4a2ec5","Type":"ContainerDied","Data":"8e21fa0f0d1ae5a6f363f2557d09422747dff2e715af252b826e3e78292a54a1"} Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.588505 4693 scope.go:117] "RemoveContainer" containerID="69183b0b7e2df556a610b3cf896e4abfe7e8bfed4dcbf458912594c4f9fdb1c1" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.588610 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.592905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" event={"ID":"390724a0-ca5c-4309-93a5-13aa44b32831","Type":"ContainerDied","Data":"fd66efc75ac9179220a8380e1f18f7a670e031731e81f5b56b2a351019ad53ec"} Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.593002 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l5tpz" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.605868 4693 scope.go:117] "RemoveContainer" containerID="138202350c7d907279449314b3ae20cf646daebd3b4d027e6c73a0dc2788a760" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.629035 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.632579 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l5tpz"] Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.638714 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.641913 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5dnpl"] Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.657169 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.657238 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.657264 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.657347 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.657381 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzx6\" (UniqueName: \"kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.758418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert\") pod \"controller-manager-f894987c-29svd\" (UID: 
\"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.758745 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzx6\" (UniqueName: \"kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.758793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.758814 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.758837 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.759835 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.760360 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.760895 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.763567 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.775379 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzx6\" 
(UniqueName: \"kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6\") pod \"controller-manager-f894987c-29svd\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:47 crc kubenswrapper[4693]: I1204 09:46:47.902366 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:48 crc kubenswrapper[4693]: I1204 09:46:48.289086 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:46:48 crc kubenswrapper[4693]: I1204 09:46:48.469630 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390724a0-ca5c-4309-93a5-13aa44b32831" path="/var/lib/kubelet/pods/390724a0-ca5c-4309-93a5-13aa44b32831/volumes" Dec 04 09:46:48 crc kubenswrapper[4693]: I1204 09:46:48.471054 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68219a06-b58a-4d36-b851-32dd1e4a2ec5" path="/var/lib/kubelet/pods/68219a06-b58a-4d36-b851-32dd1e4a2ec5/volumes" Dec 04 09:46:48 crc kubenswrapper[4693]: I1204 09:46:48.605248 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f894987c-29svd" event={"ID":"091ad27f-8755-44ef-9961-1332f8f83860","Type":"ContainerStarted","Data":"7d883f2d9421411e68404619f33490e6949a61ab4568615b6c8b92a3f6b00dfa"} Dec 04 09:46:48 crc kubenswrapper[4693]: I1204 09:46:48.693133 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.559272 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh"] Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.560414 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.562106 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh"] Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.564824 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.565392 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.565793 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.565964 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.566124 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.567850 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.582257 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-config\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.582326 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncjz\" (UniqueName: \"kubernetes.io/projected/1b3bd4b7-8006-4c14-9e79-eae945783153-kube-api-access-nncjz\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.582407 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-client-ca\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.582454 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3bd4b7-8006-4c14-9e79-eae945783153-serving-cert\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.648225 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f894987c-29svd" 
event={"ID":"091ad27f-8755-44ef-9961-1332f8f83860","Type":"ContainerStarted","Data":"ad110ea895a1292b75a713cb9031256b6f5849c41585cbd4b38a768209840cd0"} Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.648481 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.663225 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.683405 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-client-ca\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.683488 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3bd4b7-8006-4c14-9e79-eae945783153-serving-cert\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.683514 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-config\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.683553 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncjz\" (UniqueName: \"kubernetes.io/projected/1b3bd4b7-8006-4c14-9e79-eae945783153-kube-api-access-nncjz\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.684738 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-client-ca\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.684846 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3bd4b7-8006-4c14-9e79-eae945783153-config\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.689678 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3bd4b7-8006-4c14-9e79-eae945783153-serving-cert\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.706260 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f894987c-29svd" podStartSLOduration=4.706239923 podStartE2EDuration="4.706239923s" podCreationTimestamp="2025-12-04 09:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:46:49.702578408 +0000 UTC m=+255.600172161" watchObservedRunningTime="2025-12-04 09:46:49.706239923 +0000 UTC m=+255.603833676" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.727010 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncjz\" (UniqueName: \"kubernetes.io/projected/1b3bd4b7-8006-4c14-9e79-eae945783153-kube-api-access-nncjz\") pod \"route-controller-manager-5d6dd5fcf8-4wcjh\" (UID: \"1b3bd4b7-8006-4c14-9e79-eae945783153\") " pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:49 crc kubenswrapper[4693]: I1204 09:46:49.877572 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:50 crc kubenswrapper[4693]: I1204 09:46:50.377621 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh"] Dec 04 09:46:50 crc kubenswrapper[4693]: W1204 09:46:50.388275 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3bd4b7_8006_4c14_9e79_eae945783153.slice/crio-dbce8c31394cb208d03f09f876aad30d1b90f1e295f7c903456846169a017de0 WatchSource:0}: Error finding container dbce8c31394cb208d03f09f876aad30d1b90f1e295f7c903456846169a017de0: Status 404 returned error can't find the container with id dbce8c31394cb208d03f09f876aad30d1b90f1e295f7c903456846169a017de0 Dec 04 09:46:50 crc kubenswrapper[4693]: I1204 09:46:50.656592 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" event={"ID":"1b3bd4b7-8006-4c14-9e79-eae945783153","Type":"ContainerStarted","Data":"dbce8c31394cb208d03f09f876aad30d1b90f1e295f7c903456846169a017de0"} Dec 04 09:46:50 crc kubenswrapper[4693]: I1204 09:46:50.912550 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 09:46:50 crc kubenswrapper[4693]: I1204 09:46:50.935087 4693 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 09:46:50 crc kubenswrapper[4693]: I1204 09:46:50.935309 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://838d029d10876585be32f081854df97ff2cc484d18c24834321f465c3362a503" gracePeriod=5 Dec 04 09:46:52 crc kubenswrapper[4693]: I1204 09:46:52.666411 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" 
event={"ID":"1b3bd4b7-8006-4c14-9e79-eae945783153","Type":"ContainerStarted","Data":"da6f09fa4555902c4261cc0cc4107a3ae94d26de6aa12f62cb7f1c85de36f370"} Dec 04 09:46:52 crc kubenswrapper[4693]: I1204 09:46:52.666717 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:52 crc kubenswrapper[4693]: I1204 09:46:52.674460 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" Dec 04 09:46:52 crc kubenswrapper[4693]: I1204 09:46:52.684625 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d6dd5fcf8-4wcjh" podStartSLOduration=7.684607399 podStartE2EDuration="7.684607399s" podCreationTimestamp="2025-12-04 09:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:46:52.68256382 +0000 UTC m=+258.580157583" watchObservedRunningTime="2025-12-04 09:46:52.684607399 +0000 UTC m=+258.582201152" Dec 04 09:46:52 crc kubenswrapper[4693]: I1204 09:46:52.740713 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 09:46:55 crc kubenswrapper[4693]: I1204 09:46:55.199856 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 09:46:55 crc kubenswrapper[4693]: I1204 09:46:55.420989 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 09:46:56 crc kubenswrapper[4693]: I1204 09:46:56.034681 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:46:56 crc kubenswrapper[4693]: I1204 09:46:56.611492 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 09:46:56 crc kubenswrapper[4693]: I1204 09:46:56.696650 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 09:46:56 crc kubenswrapper[4693]: I1204 09:46:56.855794 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 09:46:57 crc kubenswrapper[4693]: I1204 09:46:57.632093 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 09:46:57 crc kubenswrapper[4693]: I1204 09:46:57.700010 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 09:46:57 crc kubenswrapper[4693]: I1204 09:46:57.700071 4693 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="838d029d10876585be32f081854df97ff2cc484d18c24834321f465c3362a503" exitCode=137 Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.060416 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.706738 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.706819 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7837727dc09901aaecf550235cab9d32dcd9562106cf3ea867aec6cc409bc02d" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.734184 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.734280 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.832783 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.832847 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.832919 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.832959 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.832983 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833023 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833053 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833068 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833525 4693 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833544 4693 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833558 4693 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.833570 4693 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.843106 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:46:58 crc kubenswrapper[4693]: I1204 09:46:58.935137 4693 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:46:59 crc kubenswrapper[4693]: I1204 09:46:59.646880 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 09:46:59 crc kubenswrapper[4693]: I1204 09:46:59.711549 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:47:00 crc kubenswrapper[4693]: I1204 09:47:00.434473 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 09:47:00 crc kubenswrapper[4693]: I1204 09:47:00.467272 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 09:47:02 crc kubenswrapper[4693]: I1204 09:47:02.400542 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 09:47:02 crc kubenswrapper[4693]: I1204 09:47:02.521651 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 09:47:04 crc kubenswrapper[4693]: I1204 09:47:04.515849 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:47:04 crc kubenswrapper[4693]: I1204 09:47:04.516452 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7q9x" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="registry-server" containerID="cri-o://a71348ae53add427709b4a8def892e7afc554f50c8b53fe8790527ac68cae297" gracePeriod=2 Dec 04 09:47:04 crc kubenswrapper[4693]: I1204 09:47:04.723110 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:47:04 crc kubenswrapper[4693]: I1204 09:47:04.723535 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdqwg" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="registry-server" containerID="cri-o://60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0" gracePeriod=2 Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.161882 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.162449 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f894987c-29svd" podUID="091ad27f-8755-44ef-9961-1332f8f83860" containerName="controller-manager" containerID="cri-o://ad110ea895a1292b75a713cb9031256b6f5849c41585cbd4b38a768209840cd0" gracePeriod=30 Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.731056 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.741580 4693 generic.go:334] "Generic (PLEG): container finished" podID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerID="a71348ae53add427709b4a8def892e7afc554f50c8b53fe8790527ac68cae297" exitCode=0 Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.741625 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerDied","Data":"a71348ae53add427709b4a8def892e7afc554f50c8b53fe8790527ac68cae297"} Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.751176 4693 generic.go:334] "Generic (PLEG): container finished" podID="091ad27f-8755-44ef-9961-1332f8f83860" containerID="ad110ea895a1292b75a713cb9031256b6f5849c41585cbd4b38a768209840cd0" exitCode=0 Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.751316 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f894987c-29svd" event={"ID":"091ad27f-8755-44ef-9961-1332f8f83860","Type":"ContainerDied","Data":"ad110ea895a1292b75a713cb9031256b6f5849c41585cbd4b38a768209840cd0"} Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.755049 4693 generic.go:334] "Generic (PLEG): container finished" podID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerID="60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0" exitCode=0 Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.755127 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerDied","Data":"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0"} Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.755186 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqwg" event={"ID":"de8930f8-8514-4179-a3b6-3408199d5cd8","Type":"ContainerDied","Data":"895add6ef70cf2ce83e6db1ef6d5dacb17317a0d27166c401644494af2c9bfbc"} Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.755235 4693 scope.go:117] "RemoveContainer" containerID="60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.755468 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqwg" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.816772 4693 scope.go:117] "RemoveContainer" containerID="82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.819433 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.824081 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content\") pod \"de8930f8-8514-4179-a3b6-3408199d5cd8\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.824151 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities\") pod \"de8930f8-8514-4179-a3b6-3408199d5cd8\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.824246 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2pw\" (UniqueName: \"kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw\") pod \"de8930f8-8514-4179-a3b6-3408199d5cd8\" (UID: \"de8930f8-8514-4179-a3b6-3408199d5cd8\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.825937 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities" (OuterVolumeSpecName: "utilities") pod "de8930f8-8514-4179-a3b6-3408199d5cd8" (UID: "de8930f8-8514-4179-a3b6-3408199d5cd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.830672 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw" (OuterVolumeSpecName: "kube-api-access-5m2pw") pod "de8930f8-8514-4179-a3b6-3408199d5cd8" (UID: "de8930f8-8514-4179-a3b6-3408199d5cd8"). InnerVolumeSpecName "kube-api-access-5m2pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.845679 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de8930f8-8514-4179-a3b6-3408199d5cd8" (UID: "de8930f8-8514-4179-a3b6-3408199d5cd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.847535 4693 scope.go:117] "RemoveContainer" containerID="b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.861804 4693 scope.go:117] "RemoveContainer" containerID="60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0" Dec 04 09:47:05 crc kubenswrapper[4693]: E1204 09:47:05.862293 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0\": container with ID starting with 60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0 not found: ID does not exist" containerID="60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.862391 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0"} err="failed to get container status \"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0\": rpc error: code = NotFound desc = could not find container \"60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0\": container with ID starting with 60100c3de7522efcb8c647299a25d3e0a84651e81001ea3931e7f138ff77ddf0 not found: ID does not exist" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.862426 4693 scope.go:117] "RemoveContainer" containerID="82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609" Dec 04 09:47:05 crc kubenswrapper[4693]: E1204 09:47:05.862794 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609\": container with ID starting with 82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609 not found: ID does not exist" containerID="82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.862839 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609"} err="failed to get container status \"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609\": rpc error: code = NotFound desc = could not find container \"82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609\": container with ID starting with 82a5053a29cc2cc789ed63e8c5169636dcb16d3010ba13cb1e4731ddb9b4f609 not found: ID does not exist" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.862887 4693 scope.go:117] "RemoveContainer" containerID="b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183" Dec 04 09:47:05 crc kubenswrapper[4693]: E1204 09:47:05.863178 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183\": container with ID starting with b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183 not found: ID does not exist" containerID="b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.863214 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183"} err="failed to get container status \"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183\": rpc error: code = NotFound desc = could not find container \"b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183\": container with ID starting with b659b33da8fc2ed3060af69e984702c33007de096e9d80243a0013164a591183 not found: ID does not exist" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.926762 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert\") pod \"091ad27f-8755-44ef-9961-1332f8f83860\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.926836 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config\") pod \"091ad27f-8755-44ef-9961-1332f8f83860\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.926873 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles\") pod \"091ad27f-8755-44ef-9961-1332f8f83860\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.926949 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca\") pod \"091ad27f-8755-44ef-9961-1332f8f83860\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.927007 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxzx6\" (UniqueName: \"kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6\") pod \"091ad27f-8755-44ef-9961-1332f8f83860\" (UID: \"091ad27f-8755-44ef-9961-1332f8f83860\") " Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.927261 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.927286 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8930f8-8514-4179-a3b6-3408199d5cd8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.927298 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2pw\" (UniqueName: \"kubernetes.io/projected/de8930f8-8514-4179-a3b6-3408199d5cd8-kube-api-access-5m2pw\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.927975 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "091ad27f-8755-44ef-9961-1332f8f83860" (UID: "091ad27f-8755-44ef-9961-1332f8f83860"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.928362 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca" (OuterVolumeSpecName: "client-ca") pod "091ad27f-8755-44ef-9961-1332f8f83860" (UID: "091ad27f-8755-44ef-9961-1332f8f83860"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.928572 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config" (OuterVolumeSpecName: "config") pod "091ad27f-8755-44ef-9961-1332f8f83860" (UID: "091ad27f-8755-44ef-9961-1332f8f83860"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.932071 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "091ad27f-8755-44ef-9961-1332f8f83860" (UID: "091ad27f-8755-44ef-9961-1332f8f83860"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:05 crc kubenswrapper[4693]: I1204 09:47:05.934130 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6" (OuterVolumeSpecName: "kube-api-access-fxzx6") pod "091ad27f-8755-44ef-9961-1332f8f83860" (UID: "091ad27f-8755-44ef-9961-1332f8f83860"). InnerVolumeSpecName "kube-api-access-fxzx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.028622 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.028665 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxzx6\" (UniqueName: \"kubernetes.io/projected/091ad27f-8755-44ef-9961-1332f8f83860-kube-api-access-fxzx6\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.028677 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/091ad27f-8755-44ef-9961-1332f8f83860-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.028686 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.028694 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/091ad27f-8755-44ef-9961-1332f8f83860-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.081242 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.085668 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqwg"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.352675 4693 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.432081 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znt6q\" (UniqueName: \"kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q\") pod \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.432125 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content\") pod \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.432238 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities\") pod \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\" (UID: \"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5\") " Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.433256 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities" (OuterVolumeSpecName: "utilities") pod "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" (UID: "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.438613 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q" (OuterVolumeSpecName: "kube-api-access-znt6q") pod "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" (UID: "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5"). InnerVolumeSpecName "kube-api-access-znt6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.468206 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" path="/var/lib/kubelet/pods/de8930f8-8514-4179-a3b6-3408199d5cd8/volumes" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.484361 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" (UID: "6a3f510e-344a-4433-bfbb-ed6c76c3a4b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.533363 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znt6q\" (UniqueName: \"kubernetes.io/projected/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-kube-api-access-znt6q\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.533404 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.533413 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558479 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56c45b45c8-jsztf"] Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558741 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558754 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558769 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558776 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558787 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="extract-utilities" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558794 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="extract-utilities" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558801 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558815 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558826 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="extract-utilities" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558836 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="extract-utilities" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558848 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="extract-content" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558857 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="extract-content" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558870 4693 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="091ad27f-8755-44ef-9961-1332f8f83860" containerName="controller-manager" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558876 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="091ad27f-8755-44ef-9961-1332f8f83860" containerName="controller-manager" Dec 04 09:47:06 crc kubenswrapper[4693]: E1204 09:47:06.558884 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="extract-content" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558889 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="extract-content" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558981 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8930f8-8514-4179-a3b6-3408199d5cd8" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.558991 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" containerName="registry-server" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.559000 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.559009 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="091ad27f-8755-44ef-9961-1332f8f83860" containerName="controller-manager" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.559398 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.563539 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56c45b45c8-jsztf"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.634167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655d8f7a-4856-4f54-88b0-f385028cac38-serving-cert\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.634238 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-config\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.634385 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-proxy-ca-bundles\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.634471 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899pc\" (UniqueName: \"kubernetes.io/projected/655d8f7a-4856-4f54-88b0-f385028cac38-kube-api-access-899pc\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: 
\"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.634554 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-client-ca\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.735725 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-client-ca\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.735799 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655d8f7a-4856-4f54-88b0-f385028cac38-serving-cert\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.735834 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-config\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.735851 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-proxy-ca-bundles\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.735876 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-899pc\" (UniqueName: \"kubernetes.io/projected/655d8f7a-4856-4f54-88b0-f385028cac38-kube-api-access-899pc\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.736935 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-client-ca\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.738181 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-config\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.738605 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/655d8f7a-4856-4f54-88b0-f385028cac38-proxy-ca-bundles\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.740228 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/655d8f7a-4856-4f54-88b0-f385028cac38-serving-cert\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.753481 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-899pc\" (UniqueName: \"kubernetes.io/projected/655d8f7a-4856-4f54-88b0-f385028cac38-kube-api-access-899pc\") pod \"controller-manager-56c45b45c8-jsztf\" (UID: \"655d8f7a-4856-4f54-88b0-f385028cac38\") " pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.766614 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7q9x" event={"ID":"6a3f510e-344a-4433-bfbb-ed6c76c3a4b5","Type":"ContainerDied","Data":"2c8820b01aeb9bc55a776386278d14b4ee51ab10cdf62db5737c7c53c0644022"} Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.766861 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7q9x" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.767181 4693 scope.go:117] "RemoveContainer" containerID="a71348ae53add427709b4a8def892e7afc554f50c8b53fe8790527ac68cae297" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.772922 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f894987c-29svd" event={"ID":"091ad27f-8755-44ef-9961-1332f8f83860","Type":"ContainerDied","Data":"7d883f2d9421411e68404619f33490e6949a61ab4568615b6c8b92a3f6b00dfa"} Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.772995 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f894987c-29svd" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.792543 4693 scope.go:117] "RemoveContainer" containerID="8a58ec9a1f30ead081ddd42026fe0495974cc6b81d86016b1e139a0e6c80bc40" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.805203 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.811614 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f894987c-29svd"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.823152 4693 scope.go:117] "RemoveContainer" containerID="74211032aea3ab9631ae47bbcfc6896290d74c0d6e2951ff1edb5bc8497ee5a0" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.827827 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.840049 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7q9x"] Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.842646 4693 scope.go:117] "RemoveContainer" containerID="ad110ea895a1292b75a713cb9031256b6f5849c41585cbd4b38a768209840cd0" Dec 04 09:47:06 crc kubenswrapper[4693]: I1204 09:47:06.912650 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.093877 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56c45b45c8-jsztf"] Dec 04 09:47:07 crc kubenswrapper[4693]: W1204 09:47:07.105811 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655d8f7a_4856_4f54_88b0_f385028cac38.slice/crio-7ef9f75963d970de3c92506d3dae7133c6755fe9ee2bf141f54a01996cba47d1 WatchSource:0}: Error finding container 7ef9f75963d970de3c92506d3dae7133c6755fe9ee2bf141f54a01996cba47d1: Status 404 returned error can't find the container with id 7ef9f75963d970de3c92506d3dae7133c6755fe9ee2bf141f54a01996cba47d1 Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.118137 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.118355 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sjp76" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="registry-server" containerID="cri-o://d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087" gracePeriod=2 Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.319438 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.319692 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnfsl" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="registry-server" containerID="cri-o://671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" gracePeriod=2 Dec 04 09:47:07 crc kubenswrapper[4693]: I1204 09:47:07.790441 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" event={"ID":"655d8f7a-4856-4f54-88b0-f385028cac38","Type":"ContainerStarted","Data":"7ef9f75963d970de3c92506d3dae7133c6755fe9ee2bf141f54a01996cba47d1"} Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.116505 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.469652 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091ad27f-8755-44ef-9961-1332f8f83860" path="/var/lib/kubelet/pods/091ad27f-8755-44ef-9961-1332f8f83860/volumes" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.470652 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3f510e-344a-4433-bfbb-ed6c76c3a4b5" path="/var/lib/kubelet/pods/6a3f510e-344a-4433-bfbb-ed6c76c3a4b5/volumes" Dec 04 09:47:08 crc kubenswrapper[4693]: E1204 09:47:08.606002 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae is running failed: container process not found" containerID="671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 09:47:08 crc kubenswrapper[4693]: E1204 09:47:08.606800 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae is running failed: container process not found" containerID="671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 09:47:08 crc kubenswrapper[4693]: E1204 09:47:08.608553 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae is running failed: container process not found" containerID="671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 09:47:08 crc kubenswrapper[4693]: E1204 09:47:08.608656 4693 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-nnfsl" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="registry-server" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.767905 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.801171 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac8a0ee1-b340-421c-8496-74757e180a20" containerID="671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" exitCode=0 Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.801245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerDied","Data":"671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae"} Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.803625 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" event={"ID":"655d8f7a-4856-4f54-88b0-f385028cac38","Type":"ContainerStarted","Data":"d5ebd6010aebee405798ac85c53d4b8f89d4097da7b185c1e46eea60ad6483d2"} Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.803812 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.810282 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.874718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content\") pod \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.874784 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities\") pod \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.874810 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx856\" (UniqueName: \"kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856\") pod \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\" (UID: \"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f\") " Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.876280 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities" (OuterVolumeSpecName: "utilities") pod "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" (UID: "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.886617 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856" (OuterVolumeSpecName: "kube-api-access-bx856") pod "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" (UID: "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f"). InnerVolumeSpecName "kube-api-access-bx856". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.896898 4693 generic.go:334] "Generic (PLEG): container finished" podID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerID="d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087" exitCode=0 Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.896972 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerDied","Data":"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087"} Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.897021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjp76" event={"ID":"aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f","Type":"ContainerDied","Data":"8a85b51cff4f4721346609bfd5619a22eba5b03bc2aa0b1d7e316a0116a6c7a7"} Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.897041 4693 scope.go:117] "RemoveContainer" containerID="d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.897197 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjp76" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.909696 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56c45b45c8-jsztf" podStartSLOduration=3.90967533 podStartE2EDuration="3.90967533s" podCreationTimestamp="2025-12-04 09:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:47:08.907668028 +0000 UTC m=+274.805261791" watchObservedRunningTime="2025-12-04 09:47:08.90967533 +0000 UTC m=+274.807269083" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.921783 4693 scope.go:117] "RemoveContainer" containerID="5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.954504 4693 scope.go:117] "RemoveContainer" containerID="5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.977001 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.977033 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx856\" (UniqueName: \"kubernetes.io/projected/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-kube-api-access-bx856\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:08 crc kubenswrapper[4693]: I1204 09:47:08.992436 4693 scope.go:117] "RemoveContainer" containerID="d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087" Dec 04 09:47:09 crc kubenswrapper[4693]: E1204 09:47:09.003615 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087\": container with ID starting with d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087 not found: ID does not exist" containerID="d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.003663 4693 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087"} err="failed to get container status \"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087\": rpc error: code = NotFound desc = could not find container \"d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087\": container with ID starting with d1737e7eb20e352c0f12ab3f050b9edaaa02bff4600f9d351564ba6492f53087 not found: ID does not exist" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.003692 4693 scope.go:117] "RemoveContainer" containerID="5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70" Dec 04 09:47:09 crc kubenswrapper[4693]: E1204 09:47:09.004678 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70\": container with ID starting with 5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70 not found: ID does not exist" containerID="5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.004723 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70"} err="failed to get container status \"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70\": rpc error: code = NotFound desc = could not find container \"5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70\": container with ID starting with 5c8dee3ec09fd90e28c4aa8718af4b68356af5f16b59064bf0beb0fe3db0ca70 not found: ID does not exist" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.004751 4693 scope.go:117] "RemoveContainer" containerID="5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a" Dec 04 09:47:09 crc kubenswrapper[4693]: E1204 09:47:09.007730 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a\": container with ID starting with 5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a not found: ID does not exist" containerID="5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.007764 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a"} err="failed to get container status \"5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a\": rpc error: code = NotFound desc = could not find container \"5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a\": container with ID starting with 5bb191a79397e5aecc102a47f1eac42fb9cfb8303b765dbed3aa83562bccb61a not found: ID does not exist" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.051172 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" (UID: "aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.078456 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.141504 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.179259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities\") pod \"ac8a0ee1-b340-421c-8496-74757e180a20\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.179366 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content\") pod \"ac8a0ee1-b340-421c-8496-74757e180a20\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.179432 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99788\" (UniqueName: \"kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788\") pod \"ac8a0ee1-b340-421c-8496-74757e180a20\" (UID: \"ac8a0ee1-b340-421c-8496-74757e180a20\") " Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.180074 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities" (OuterVolumeSpecName: "utilities") pod "ac8a0ee1-b340-421c-8496-74757e180a20" (UID: "ac8a0ee1-b340-421c-8496-74757e180a20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.186512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788" (OuterVolumeSpecName: "kube-api-access-99788") pod "ac8a0ee1-b340-421c-8496-74757e180a20" (UID: "ac8a0ee1-b340-421c-8496-74757e180a20"). InnerVolumeSpecName "kube-api-access-99788". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.232522 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.236315 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sjp76"] Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.240764 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac8a0ee1-b340-421c-8496-74757e180a20" (UID: "ac8a0ee1-b340-421c-8496-74757e180a20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.280664 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.280986 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99788\" (UniqueName: \"kubernetes.io/projected/ac8a0ee1-b340-421c-8496-74757e180a20-kube-api-access-99788\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.281065 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8a0ee1-b340-421c-8496-74757e180a20-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.907864 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnfsl" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.910439 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnfsl" event={"ID":"ac8a0ee1-b340-421c-8496-74757e180a20","Type":"ContainerDied","Data":"2ae632fe1c4f4979c0c949743ec3a04a2eb9492d80367d94757a2a8159031e88"} Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.910515 4693 scope.go:117] "RemoveContainer" containerID="671bfcf7ed0344ccc65f037f59c8e38257a1066ebd1d09eadfe63460f1c9fcae" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.930001 4693 scope.go:117] "RemoveContainer" containerID="e139e343accd1785464899b87f152c13d96a827cda91a862accaad2209d96dc5" Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.935843 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.945278 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnfsl"] Dec 04 09:47:09 crc kubenswrapper[4693]: I1204 09:47:09.957782 4693 scope.go:117] "RemoveContainer" containerID="ed5795b6e212f2841ff5cdc30fff5e5902df2537d91390ec4e25ef4164d3ac9d" Dec 04 09:47:10 crc kubenswrapper[4693]: I1204 09:47:10.469596 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" path="/var/lib/kubelet/pods/ac8a0ee1-b340-421c-8496-74757e180a20/volumes" Dec 04 09:47:10 crc kubenswrapper[4693]: I1204 09:47:10.470382 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" path="/var/lib/kubelet/pods/aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f/volumes" Dec 04 09:47:10 crc kubenswrapper[4693]: I1204 09:47:10.961691 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" containerID="cri-o://8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b" gracePeriod=15 Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.889737 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.922439 4693 generic.go:334] "Generic (PLEG): container finished" podID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerID="8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b" exitCode=0 Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.922498 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" event={"ID":"764c0924-2f3b-4341-9922-a22d2f3cf145","Type":"ContainerDied","Data":"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b"} Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.922533 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" event={"ID":"764c0924-2f3b-4341-9922-a22d2f3cf145","Type":"ContainerDied","Data":"9cfef2b33f61e61ae0e8a757a9418b45119913a0a50464f3eea21c5888ae754a"} Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.922552 4693 scope.go:117] "RemoveContainer" containerID="8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b" Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.922681 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jznlz" Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.940356 4693 scope.go:117] "RemoveContainer" containerID="8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b" Dec 04 09:47:11 crc kubenswrapper[4693]: E1204 09:47:11.940879 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b\": container with ID starting with 8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b not found: ID does not exist" containerID="8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b" Dec 04 09:47:11 crc kubenswrapper[4693]: I1204 09:47:11.941490 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b"} err="failed to get container status \"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b\": rpc error: code = NotFound desc = could not find container \"8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b\": container with ID starting with 8cad583a494e5ac78d0833e3cc1cb17cd46d567d0f16c1fddeed6592200ae00b not found: ID does not exist" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019524 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019625 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019665 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019686 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019728 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019760 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019792 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdqf\" (UniqueName: \"kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019823 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019852 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019876 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019899 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019927 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.019990 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.020019 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session\") pod \"764c0924-2f3b-4341-9922-a22d2f3cf145\" (UID: \"764c0924-2f3b-4341-9922-a22d2f3cf145\") " Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.020357 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.021025 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.021115 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.021279 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.021474 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.025161 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.025775 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.029019 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.029162 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf" (OuterVolumeSpecName: "kube-api-access-ngdqf") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "kube-api-access-ngdqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.029299 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.029736 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.034625 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.034835 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.035357 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "764c0924-2f3b-4341-9922-a22d2f3cf145" (UID: "764c0924-2f3b-4341-9922-a22d2f3cf145"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.120940 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.120989 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121005 4693 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121015 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121026 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121035 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121045 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121054 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121063 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121073 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121098 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121107 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/764c0924-2f3b-4341-9922-a22d2f3cf145-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121116 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdqf\" (UniqueName: \"kubernetes.io/projected/764c0924-2f3b-4341-9922-a22d2f3cf145-kube-api-access-ngdqf\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.121126 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/764c0924-2f3b-4341-9922-a22d2f3cf145-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.256209 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.261033 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jznlz"] Dec 04 09:47:12 crc kubenswrapper[4693]: I1204 09:47:12.467532 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" path="/var/lib/kubelet/pods/764c0924-2f3b-4341-9922-a22d2f3cf145/volumes" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.567313 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b8b86856-4fv4p"] Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568442 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="extract-utilities" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568470 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="extract-utilities" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568490 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568510 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568538 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="extract-content" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568554 4693 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="extract-content" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568572 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="extract-utilities" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568588 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="extract-utilities" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568608 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568624 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568665 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568680 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: E1204 09:47:18.568696 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="extract-content" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568713 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="extract-content" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568929 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="764c0924-2f3b-4341-9922-a22d2f3cf145" containerName="oauth-openshift" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568956 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee4fe2f-0b17-4ff0-9f0c-9c7ae6b8ad6f" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.568974 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8a0ee1-b340-421c-8496-74757e180a20" containerName="registry-server" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.569725 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578078 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578540 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578565 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578259 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578362 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578323 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578377 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578410 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578481 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578509 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.578517 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.580635 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.582113 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b8b86856-4fv4p"] Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.585167 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.592654 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.599091 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698295 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-dir\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 
09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698415 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698465 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698499 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdbz\" (UniqueName: \"kubernetes.io/projected/1edde0c0-597e-4e68-ab91-3d81904f5bad-kube-api-access-ftdbz\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698526 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698548 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698570 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698648 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-session\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698716 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8b86856-4fv4p\" 
(UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698744 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-policies\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698761 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.698959 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.699003 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.799895 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.799937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.799977 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-dir\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.799994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800020 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdbz\" (UniqueName: \"kubernetes.io/projected/1edde0c0-597e-4e68-ab91-3d81904f5bad-kube-api-access-ftdbz\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800073 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800100 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800120 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-session\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800121 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-dir\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800140 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800210 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800237 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-policies\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800253 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.800291 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.801363 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-audit-policies\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.801467 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.801681 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.801849 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " 
pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.806278 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.806539 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.806826 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.807831 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.807946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-session\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.808075 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.808110 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.810399 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1edde0c0-597e-4e68-ab91-3d81904f5bad-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " 
pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.820384 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdbz\" (UniqueName: \"kubernetes.io/projected/1edde0c0-597e-4e68-ab91-3d81904f5bad-kube-api-access-ftdbz\") pod \"oauth-openshift-6b8b86856-4fv4p\" (UID: \"1edde0c0-597e-4e68-ab91-3d81904f5bad\") " pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:18 crc kubenswrapper[4693]: I1204 09:47:18.899303 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:19 crc kubenswrapper[4693]: I1204 09:47:19.338472 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b8b86856-4fv4p"] Dec 04 09:47:19 crc kubenswrapper[4693]: I1204 09:47:19.991393 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" event={"ID":"1edde0c0-597e-4e68-ab91-3d81904f5bad","Type":"ContainerStarted","Data":"6d0de6779a2da21ba64a9089cbdf63c80aebf4df6d34941a1cc9e9b68d92a023"} Dec 04 09:47:19 crc kubenswrapper[4693]: I1204 09:47:19.991772 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" event={"ID":"1edde0c0-597e-4e68-ab91-3d81904f5bad","Type":"ContainerStarted","Data":"32c5f505b8043cc9817f6c484afe9bd7c5745a82beabd16f6b88125596b5993f"} Dec 04 09:47:19 crc kubenswrapper[4693]: I1204 09:47:19.992126 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:20 crc kubenswrapper[4693]: I1204 09:47:20.017248 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" podStartSLOduration=35.017231334 podStartE2EDuration="35.017231334s" podCreationTimestamp="2025-12-04 09:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:47:20.012489189 +0000 UTC m=+285.910082962" watchObservedRunningTime="2025-12-04 09:47:20.017231334 +0000 UTC m=+285.914825087" Dec 04 09:47:20 crc kubenswrapper[4693]: I1204 09:47:20.355455 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b8b86856-4fv4p" Dec 04 09:47:34 crc kubenswrapper[4693]: I1204 09:47:34.226460 4693 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.704358 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gww92"] Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.705836 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.716225 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gww92"] Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779712 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-bound-sa-token\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-trusted-ca\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779795 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-tls\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779833 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779861 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clx96\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-kube-api-access-clx96\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779903 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88767b83-0d11-46f7-b46e-c2c6d02634cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.779922 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88767b83-0d11-46f7-b46e-c2c6d02634cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.780112 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-certificates\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.799252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.880842 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clx96\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-kube-api-access-clx96\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.880894 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88767b83-0d11-46f7-b46e-c2c6d02634cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.880911 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88767b83-0d11-46f7-b46e-c2c6d02634cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.880950 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-certificates\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.880991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-bound-sa-token\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.881013 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-trusted-ca\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.881051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-tls\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.881539 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/88767b83-0d11-46f7-b46e-c2c6d02634cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.882373 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-certificates\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.882499 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88767b83-0d11-46f7-b46e-c2c6d02634cd-trusted-ca\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.886616 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/88767b83-0d11-46f7-b46e-c2c6d02634cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.886658 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-registry-tls\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.900355 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clx96\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-kube-api-access-clx96\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:39 crc kubenswrapper[4693]: I1204 09:47:39.902475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88767b83-0d11-46f7-b46e-c2c6d02634cd-bound-sa-token\") pod \"image-registry-66df7c8f76-gww92\" (UID: \"88767b83-0d11-46f7-b46e-c2c6d02634cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:40 crc kubenswrapper[4693]: I1204 09:47:40.020457 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:40 crc kubenswrapper[4693]: I1204 09:47:40.417802 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gww92"] Dec 04 09:47:41 crc kubenswrapper[4693]: I1204 09:47:41.134112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" event={"ID":"88767b83-0d11-46f7-b46e-c2c6d02634cd","Type":"ContainerStarted","Data":"5453111e8291d3a6fc6fb0b9d88ae048b484857ee7b3abdf650204ca12164f3f"} Dec 04 09:47:42 crc kubenswrapper[4693]: I1204 09:47:42.143611 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" event={"ID":"88767b83-0d11-46f7-b46e-c2c6d02634cd","Type":"ContainerStarted","Data":"e42876010efbca16ba75628012062f04dfdaddc741a1d72e90b641ef7774e54e"} Dec 04 09:47:42 crc kubenswrapper[4693]: I1204 09:47:42.143897 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:47:42 crc kubenswrapper[4693]: I1204 09:47:42.164731 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" podStartSLOduration=3.164712583 podStartE2EDuration="3.164712583s" podCreationTimestamp="2025-12-04 09:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:47:42.163035391 +0000 UTC m=+308.060629144" watchObservedRunningTime="2025-12-04 09:47:42.164712583 +0000 UTC m=+308.062306336" Dec 04 09:48:00 crc kubenswrapper[4693]: I1204 09:48:00.026945 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gww92" Dec 04 09:48:00 crc kubenswrapper[4693]: I1204 09:48:00.075746 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.545525 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.551194 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5rl2n" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="registry-server" containerID="cri-o://18de67ff61d1c233beed5d267b7a8bb8263a342eafc1750e3f3427561c5d0ea0" gracePeriod=30 Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.553424 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.553757 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c7zjm" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="registry-server" containerID="cri-o://222f992a57089b16283cd25822fb7fda97ee625ca48ce0858a85b2516afb98f1" gracePeriod=30 Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.557971 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.558225 4693 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" containerName="marketplace-operator" containerID="cri-o://5ea074f2e9de7abe08b01bac6fcba3e76c3c10e1c958067b46d7e6459f8f9d25" gracePeriod=30 Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.580636 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.580922 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2d6g4" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="registry-server" containerID="cri-o://9635674fc4bf52b71ddfa0cf7ed55d8004b61d30106030e9db26618f8e381f86" gracePeriod=30 Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.597755 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26frc"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.599652 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.618192 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.618562 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgqzj" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="registry-server" containerID="cri-o://ad9ccae979860da0683251b0407431fb69c858888d1b76eaf3e620a02ea26e16" gracePeriod=30 Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.629342 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26frc"] Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.659852 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.660162 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.660297 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vm7j\" (UniqueName: \"kubernetes.io/projected/57e8fd24-01fe-42d0-9bd6-6066003c724b-kube-api-access-6vm7j\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.762388 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.762734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.762838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vm7j\" (UniqueName: \"kubernetes.io/projected/57e8fd24-01fe-42d0-9bd6-6066003c724b-kube-api-access-6vm7j\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.763943 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.774290 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57e8fd24-01fe-42d0-9bd6-6066003c724b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.780248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vm7j\" (UniqueName: \"kubernetes.io/projected/57e8fd24-01fe-42d0-9bd6-6066003c724b-kube-api-access-6vm7j\") pod \"marketplace-operator-79b997595-26frc\" (UID: \"57e8fd24-01fe-42d0-9bd6-6066003c724b\") " pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:05 crc kubenswrapper[4693]: I1204 09:48:05.920965 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:06 crc kubenswrapper[4693]: I1204 09:48:06.343746 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26frc"] Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.279012 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" event={"ID":"57e8fd24-01fe-42d0-9bd6-6066003c724b","Type":"ContainerStarted","Data":"b8ebf07756367fa1aa1baefdd5f3b75d431afddbae879b96673f5769ac54fd3b"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.279401 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.279420 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" event={"ID":"57e8fd24-01fe-42d0-9bd6-6066003c724b","Type":"ContainerStarted","Data":"b8ee3bb6bf6faa6c78aaa76ad41160724d400d9ec9507e7fbda29296e81996f5"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.284613 4693 generic.go:334] "Generic (PLEG): container finished" podID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerID="222f992a57089b16283cd25822fb7fda97ee625ca48ce0858a85b2516afb98f1" exitCode=0 Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.284718 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerDied","Data":"222f992a57089b16283cd25822fb7fda97ee625ca48ce0858a85b2516afb98f1"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.299205 4693 generic.go:334] "Generic (PLEG): container finished" podID="29687875-23eb-403d-a89f-eb4d32092d7e" containerID="5ea074f2e9de7abe08b01bac6fcba3e76c3c10e1c958067b46d7e6459f8f9d25" exitCode=0 Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.304941 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.304997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" event={"ID":"29687875-23eb-403d-a89f-eb4d32092d7e","Type":"ContainerDied","Data":"5ea074f2e9de7abe08b01bac6fcba3e76c3c10e1c958067b46d7e6459f8f9d25"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.306675 4693 generic.go:334] "Generic (PLEG): container finished" podID="7070356d-e89a-4f1c-a247-051bf520ae02" containerID="9635674fc4bf52b71ddfa0cf7ed55d8004b61d30106030e9db26618f8e381f86" exitCode=0 Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.306834 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerDied","Data":"9635674fc4bf52b71ddfa0cf7ed55d8004b61d30106030e9db26618f8e381f86"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.310961 4693 generic.go:334] "Generic (PLEG): container finished" podID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerID="ad9ccae979860da0683251b0407431fb69c858888d1b76eaf3e620a02ea26e16" exitCode=0 Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.311031 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" 
event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerDied","Data":"ad9ccae979860da0683251b0407431fb69c858888d1b76eaf3e620a02ea26e16"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.313479 4693 generic.go:334] "Generic (PLEG): container finished" podID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerID="18de67ff61d1c233beed5d267b7a8bb8263a342eafc1750e3f3427561c5d0ea0" exitCode=0 Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.313510 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerDied","Data":"18de67ff61d1c233beed5d267b7a8bb8263a342eafc1750e3f3427561c5d0ea0"} Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.314719 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-26frc" podStartSLOduration=2.314697031 podStartE2EDuration="2.314697031s" podCreationTimestamp="2025-12-04 09:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:48:07.303476627 +0000 UTC m=+333.201070380" watchObservedRunningTime="2025-12-04 09:48:07.314697031 +0000 UTC m=+333.212290784" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.375441 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.484854 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities\") pod \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.484957 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssh2h\" (UniqueName: \"kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h\") pod \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.485047 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content\") pod \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\" (UID: \"a7c911e8-e91f-4c2d-9c38-d6bce33a819f\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.486696 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities" (OuterVolumeSpecName: "utilities") pod "a7c911e8-e91f-4c2d-9c38-d6bce33a819f" (UID: "a7c911e8-e91f-4c2d-9c38-d6bce33a819f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.491799 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h" (OuterVolumeSpecName: "kube-api-access-ssh2h") pod "a7c911e8-e91f-4c2d-9c38-d6bce33a819f" (UID: "a7c911e8-e91f-4c2d-9c38-d6bce33a819f"). InnerVolumeSpecName "kube-api-access-ssh2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.511175 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.520526 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.525971 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.552117 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7c911e8-e91f-4c2d-9c38-d6bce33a819f" (UID: "a7c911e8-e91f-4c2d-9c38-d6bce33a819f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.573572 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587084 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content\") pod \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587128 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities\") pod \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587174 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca\") pod \"29687875-23eb-403d-a89f-eb4d32092d7e\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587192 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities\") pod \"2bedbed9-581a-414f-a92c-fc4933fcac93\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587213 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bptj\" (UniqueName: \"kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj\") pod \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\" (UID: \"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587249 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics\") pod \"29687875-23eb-403d-a89f-eb4d32092d7e\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587269 4693 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf9k8\" (UniqueName: \"kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8\") pod \"2bedbed9-581a-414f-a92c-fc4933fcac93\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587319 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm66h\" (UniqueName: \"kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h\") pod \"29687875-23eb-403d-a89f-eb4d32092d7e\" (UID: \"29687875-23eb-403d-a89f-eb4d32092d7e\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587362 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content\") pod \"2bedbed9-581a-414f-a92c-fc4933fcac93\" (UID: \"2bedbed9-581a-414f-a92c-fc4933fcac93\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587590 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssh2h\" (UniqueName: \"kubernetes.io/projected/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-kube-api-access-ssh2h\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587601 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587609 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c911e8-e91f-4c2d-9c38-d6bce33a819f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.587649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "29687875-23eb-403d-a89f-eb4d32092d7e" (UID: "29687875-23eb-403d-a89f-eb4d32092d7e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.588131 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities" (OuterVolumeSpecName: "utilities") pod "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" (UID: "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.588448 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities" (OuterVolumeSpecName: "utilities") pod "2bedbed9-581a-414f-a92c-fc4933fcac93" (UID: "2bedbed9-581a-414f-a92c-fc4933fcac93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.590511 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj" (OuterVolumeSpecName: "kube-api-access-5bptj") pod "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" (UID: "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a"). 
InnerVolumeSpecName "kube-api-access-5bptj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.590600 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8" (OuterVolumeSpecName: "kube-api-access-cf9k8") pod "2bedbed9-581a-414f-a92c-fc4933fcac93" (UID: "2bedbed9-581a-414f-a92c-fc4933fcac93"). InnerVolumeSpecName "kube-api-access-cf9k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.591356 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h" (OuterVolumeSpecName: "kube-api-access-rm66h") pod "29687875-23eb-403d-a89f-eb4d32092d7e" (UID: "29687875-23eb-403d-a89f-eb4d32092d7e"). InnerVolumeSpecName "kube-api-access-rm66h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.591562 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "29687875-23eb-403d-a89f-eb4d32092d7e" (UID: "29687875-23eb-403d-a89f-eb4d32092d7e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.655094 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bedbed9-581a-414f-a92c-fc4933fcac93" (UID: "2bedbed9-581a-414f-a92c-fc4933fcac93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689192 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmjxh\" (UniqueName: \"kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh\") pod \"7070356d-e89a-4f1c-a247-051bf520ae02\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689282 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities\") pod \"7070356d-e89a-4f1c-a247-051bf520ae02\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689326 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content\") pod \"7070356d-e89a-4f1c-a247-051bf520ae02\" (UID: \"7070356d-e89a-4f1c-a247-051bf520ae02\") " Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689639 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689657 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689667 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bptj\" (UniqueName: \"kubernetes.io/projected/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-kube-api-access-5bptj\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689677 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29687875-23eb-403d-a89f-eb4d32092d7e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689686 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf9k8\" (UniqueName: \"kubernetes.io/projected/2bedbed9-581a-414f-a92c-fc4933fcac93-kube-api-access-cf9k8\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689695 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm66h\" (UniqueName: \"kubernetes.io/projected/29687875-23eb-403d-a89f-eb4d32092d7e-kube-api-access-rm66h\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689704 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bedbed9-581a-414f-a92c-fc4933fcac93-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.689712 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.690583 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities" 
(OuterVolumeSpecName: "utilities") pod "7070356d-e89a-4f1c-a247-051bf520ae02" (UID: "7070356d-e89a-4f1c-a247-051bf520ae02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.700019 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh" (OuterVolumeSpecName: "kube-api-access-bmjxh") pod "7070356d-e89a-4f1c-a247-051bf520ae02" (UID: "7070356d-e89a-4f1c-a247-051bf520ae02"). InnerVolumeSpecName "kube-api-access-bmjxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.711323 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" (UID: "7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.713119 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7070356d-e89a-4f1c-a247-051bf520ae02" (UID: "7070356d-e89a-4f1c-a247-051bf520ae02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.790506 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.790541 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.790553 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmjxh\" (UniqueName: \"kubernetes.io/projected/7070356d-e89a-4f1c-a247-051bf520ae02-kube-api-access-bmjxh\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:07 crc kubenswrapper[4693]: I1204 09:48:07.790566 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7070356d-e89a-4f1c-a247-051bf520ae02-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.323558 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c7zjm" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.325141 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7zjm" event={"ID":"a7c911e8-e91f-4c2d-9c38-d6bce33a819f","Type":"ContainerDied","Data":"9e4d794da92ccafa900c741e98d3f2b0b53af66fb7b77d4a13ac64f9e8c14b6c"} Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.325194 4693 scope.go:117] "RemoveContainer" containerID="222f992a57089b16283cd25822fb7fda97ee625ca48ce0858a85b2516afb98f1" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.327660 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" event={"ID":"29687875-23eb-403d-a89f-eb4d32092d7e","Type":"ContainerDied","Data":"d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0"} Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.327751 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt5h7" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.330087 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2d6g4" event={"ID":"7070356d-e89a-4f1c-a247-051bf520ae02","Type":"ContainerDied","Data":"bb36fb658639b087df9af228a79c625eae9a86a09ce8b379ca20a4d9a507fc5b"} Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.330528 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2d6g4" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.336078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgqzj" event={"ID":"7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a","Type":"ContainerDied","Data":"10787858b82d57fb7f87b84846a666a2034b611b52ab3f89204097bc6d1281fa"} Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.336128 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgqzj" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.338705 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rl2n" event={"ID":"2bedbed9-581a-414f-a92c-fc4933fcac93","Type":"ContainerDied","Data":"2fc69693333ae9fb07ef965970a82cf7d6b6d46a9657cd58a686237d9781cac5"} Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.338720 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rl2n" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.342497 4693 scope.go:117] "RemoveContainer" containerID="edae1bd887364324f804eee6833761767c24a8164e059114e74a572b436a2e53" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.368623 4693 scope.go:117] "RemoveContainer" containerID="32b7a15cdff06a3f3e0c8efb8af5ac3b758859eb6a1fe35456753a77ab784cf1" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.369051 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.378712 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c7zjm"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.384173 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.387424 4693 scope.go:117] "RemoveContainer" containerID="5ea074f2e9de7abe08b01bac6fcba3e76c3c10e1c958067b46d7e6459f8f9d25" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.388467 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt5h7"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.407192 4693 scope.go:117] "RemoveContainer" containerID="9635674fc4bf52b71ddfa0cf7ed55d8004b61d30106030e9db26618f8e381f86" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.428024 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.432462 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgqzj"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.454285 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.454483 4693 scope.go:117] "RemoveContainer" containerID="4a0f8998eea98d262bb44f1c07beb6ff91d86e8d3af2100e0716445466d0b046" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.461110 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5rl2n"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.484203 4693 scope.go:117] "RemoveContainer" containerID="ed157ad9534a02a18ade935f9d1c5b2a35dcdc53e5f6d79a1fcd67b7227952e4" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.495707 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" path="/var/lib/kubelet/pods/29687875-23eb-403d-a89f-eb4d32092d7e/volumes" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.496182 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" path="/var/lib/kubelet/pods/2bedbed9-581a-414f-a92c-fc4933fcac93/volumes" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.496769 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" path="/var/lib/kubelet/pods/7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a/volumes" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.497875 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" path="/var/lib/kubelet/pods/a7c911e8-e91f-4c2d-9c38-d6bce33a819f/volumes" 
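The "Cleaned up orphaned pod volumes dir" entries above show the kubelet removing the per-pod state directories under /var/lib/kubelet/pods/<UID>/volumes once the marketplace pods have been deleted via the API. For cross-checking that cleanup on the node, the following read-only sketch walks the same tree and reports which pod directories still hold volume entries. It is not kubelet code; it only assumes the directory layout visible in these log lines, the default kubelet --root-dir, and read access to /var/lib/kubelet (typically root on the node).

// orphanscan.go — minimal, read-only sketch (not kubelet code): list pod
// directories under /var/lib/kubelet/pods that still contain volume entries,
// i.e. candidates the kubelet's orphaned-volume cleanup has not removed yet.
// Assumes the default --root-dir and the layout shown in the entries above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	base := "/var/lib/kubelet/pods"
	pods, err := os.ReadDir(base)
	if err != nil {
		fmt.Fprintln(os.Stderr, "cannot read", base+":", err)
		os.Exit(1)
	}
	for _, p := range pods {
		if !p.IsDir() {
			continue
		}
		volDir := filepath.Join(base, p.Name(), "volumes")
		plugins, err := os.ReadDir(volDir)
		if err != nil {
			// No volumes/ directory left for this pod UID: already cleaned up.
			continue
		}
		n := 0
		for _, plug := range plugins {
			entries, _ := os.ReadDir(filepath.Join(volDir, plug.Name()))
			n += len(entries)
		}
		fmt.Printf("pod %s: %d volume(s) still present under %s\n", p.Name(), n, volDir)
	}
}

Run on the node after a teardown like the one logged here, it should stop listing a pod UID (for example 29687875-23eb-403d-a89f-eb4d32092d7e) once the corresponding "Cleaned up orphaned pod volumes dir" entry has appeared for it.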
Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.498390 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.498416 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2d6g4"] Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.500484 4693 scope.go:117] "RemoveContainer" containerID="ad9ccae979860da0683251b0407431fb69c858888d1b76eaf3e620a02ea26e16" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.514572 4693 scope.go:117] "RemoveContainer" containerID="a2ea22eec436822d85593c735a68b6b48c73e91f258b9288e8290a67e3ac159e" Dec 04 09:48:08 crc kubenswrapper[4693]: E1204 09:48:08.518450 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bedbed9_581a_414f_a92c_fc4933fcac93.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b8b70c6_5c0f_4ed1_8b39_43d92bd51c4a.slice/crio-10787858b82d57fb7f87b84846a666a2034b611b52ab3f89204097bc6d1281fa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29687875_23eb_403d_a89f_eb4d32092d7e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29687875_23eb_403d_a89f_eb4d32092d7e.slice/crio-d0d110af8dd41ee6559d31f4c6ff175be9743720a0daa8ca5338725bb39a38e0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b8b70c6_5c0f_4ed1_8b39_43d92bd51c4a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bedbed9_581a_414f_a92c_fc4933fcac93.slice/crio-2fc69693333ae9fb07ef965970a82cf7d6b6d46a9657cd58a686237d9781cac5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7070356d_e89a_4f1c_a247_051bf520ae02.slice\": RecentStats: unable to find data in memory cache]" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.530549 4693 scope.go:117] "RemoveContainer" containerID="e3f7ca85ddaf3f7a9327b7d4874760517c0b87134b47474d2cf4dcb5b20535b8" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.543813 4693 scope.go:117] "RemoveContainer" containerID="18de67ff61d1c233beed5d267b7a8bb8263a342eafc1750e3f3427561c5d0ea0" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.557544 4693 scope.go:117] "RemoveContainer" containerID="951b458475e15d0a1b8df4e664c6df7a558952840f7f6cce525851b2827dccf6" Dec 04 09:48:08 crc kubenswrapper[4693]: I1204 09:48:08.572558 4693 scope.go:117] "RemoveContainer" containerID="931f810d664bed92f796b0aa180fecaa0bdb98e541abddd20e76338315221e79" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.161601 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xd4kh"] Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162040 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162051 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: 
E1204 09:48:09.162064 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162070 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162078 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162084 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162091 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162097 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162104 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162110 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162120 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162128 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162138 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162143 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162152 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" containerName="marketplace-operator" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162157 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" containerName="marketplace-operator" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162163 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162172 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="extract-utilities" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162213 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162220 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="extract-content" 
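The "RemoveStaleState: removing container" and "Deleted CPUSet assignment" entries above (and the memory_manager "RemoveStaleState removing state" entries that follow) are the kubelet's CPU and memory managers discarding per-container resource assignments left over from the pods that were just deleted. Those assignments are checkpointed on disk, so they can be inspected directly; the sketch below simply dumps the CPU manager checkpoint as JSON. It is not kubelet code, and it assumes the default checkpoint path /var/lib/kubelet/cpu_manager_state (adjust if the kubelet runs with a non-default --root-dir); the JSON is decoded generically rather than against an assumed schema.

// cpustate.go — minimal, read-only sketch (not part of kubelet): dump the CPU
// manager checkpoint so the assignments being deleted in the log above can be
// compared with what is still recorded on disk.
// Assumption: default checkpoint location /var/lib/kubelet/cpu_manager_state.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	path := "/var/lib/kubelet/cpu_manager_state"
	raw, err := os.ReadFile(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, "cannot read", path+":", err)
		os.Exit(1)
	}
	// The checkpoint is a small JSON document; decode it generically instead
	// of assuming exact field names, then pretty-print it.
	var state map[string]interface{}
	if err := json.Unmarshal(raw, &state); err != nil {
		fmt.Fprintln(os.Stderr, "cannot parse checkpoint:", err)
		os.Exit(1)
	}
	pretty, _ := json.MarshalIndent(state, "", "  ")
	fmt.Println(string(pretty))
}

The memory manager keeps a similar checkpoint (memory_manager_state) alongside it, which can be read the same way.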
Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162229 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162235 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162243 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162250 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: E1204 09:48:09.162259 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162266 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="extract-content" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162361 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8b70c6-5c0f-4ed1-8b39-43d92bd51c4a" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162378 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bedbed9-581a-414f-a92c-fc4933fcac93" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162385 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="29687875-23eb-403d-a89f-eb4d32092d7e" containerName="marketplace-operator" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162392 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c911e8-e91f-4c2d-9c38-d6bce33a819f" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.162402 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" containerName="registry-server" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.163096 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.166140 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.169367 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xd4kh"] Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.206153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-utilities\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.206191 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-catalog-content\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.206243 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl8c\" (UniqueName: \"kubernetes.io/projected/96d24e68-be12-479a-8354-d30848a5b7e1-kube-api-access-nhl8c\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.306886 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-utilities\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.306936 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-catalog-content\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.306975 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl8c\" (UniqueName: \"kubernetes.io/projected/96d24e68-be12-479a-8354-d30848a5b7e1-kube-api-access-nhl8c\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.307495 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-utilities\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.307547 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d24e68-be12-479a-8354-d30848a5b7e1-catalog-content\") pod \"certified-operators-xd4kh\" (UID: 
\"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.323217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhl8c\" (UniqueName: \"kubernetes.io/projected/96d24e68-be12-479a-8354-d30848a5b7e1-kube-api-access-nhl8c\") pod \"certified-operators-xd4kh\" (UID: \"96d24e68-be12-479a-8354-d30848a5b7e1\") " pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:09 crc kubenswrapper[4693]: I1204 09:48:09.482115 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.047571 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xd4kh"] Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.162364 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mm9jb"] Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.166160 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.167875 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.173919 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mm9jb"] Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.236020 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df788\" (UniqueName: \"kubernetes.io/projected/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-kube-api-access-df788\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.236078 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-utilities\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.236110 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-catalog-content\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.336841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df788\" (UniqueName: \"kubernetes.io/projected/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-kube-api-access-df788\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.336891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-utilities\") pod \"community-operators-mm9jb\" (UID: 
\"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.336923 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-catalog-content\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.337437 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-catalog-content\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.337558 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-utilities\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.358110 4693 generic.go:334] "Generic (PLEG): container finished" podID="96d24e68-be12-479a-8354-d30848a5b7e1" containerID="c56bf8a15c81c58bc54ecadbb7cc7e0c5564e34aa47a14f99fe08c0d344dc89b" exitCode=0 Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.358154 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xd4kh" event={"ID":"96d24e68-be12-479a-8354-d30848a5b7e1","Type":"ContainerDied","Data":"c56bf8a15c81c58bc54ecadbb7cc7e0c5564e34aa47a14f99fe08c0d344dc89b"} Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.358216 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xd4kh" event={"ID":"96d24e68-be12-479a-8354-d30848a5b7e1","Type":"ContainerStarted","Data":"36f12ab664d15e03fa5beac1b3af2f4b8766ea98167e30dfadde17860407e5c5"} Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.360743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df788\" (UniqueName: \"kubernetes.io/projected/12ac24d7-f1d6-48c9-95cb-ac54e898ad99-kube-api-access-df788\") pod \"community-operators-mm9jb\" (UID: \"12ac24d7-f1d6-48c9-95cb-ac54e898ad99\") " pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.470767 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7070356d-e89a-4f1c-a247-051bf520ae02" path="/var/lib/kubelet/pods/7070356d-e89a-4f1c-a247-051bf520ae02/volumes" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.482145 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:10 crc kubenswrapper[4693]: I1204 09:48:10.890794 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mm9jb"] Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.366135 4693 generic.go:334] "Generic (PLEG): container finished" podID="12ac24d7-f1d6-48c9-95cb-ac54e898ad99" containerID="5363f442ab6072589617097871fade2fefd6c87132a5faa760eaedde3e8271db" exitCode=0 Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.366400 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jb" event={"ID":"12ac24d7-f1d6-48c9-95cb-ac54e898ad99","Type":"ContainerDied","Data":"5363f442ab6072589617097871fade2fefd6c87132a5faa760eaedde3e8271db"} Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.366852 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jb" event={"ID":"12ac24d7-f1d6-48c9-95cb-ac54e898ad99","Type":"ContainerStarted","Data":"64533b65db8cf4ec326db5c002d19854b855d3e239549eddeb23d35835aa9249"} Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.373164 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xd4kh" event={"ID":"96d24e68-be12-479a-8354-d30848a5b7e1","Type":"ContainerStarted","Data":"c57c0a5d8238b5e560516f6cf32934dd944b34d829845de667f868a3a7e91da9"} Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.559280 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bt2sw"] Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.560401 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.562096 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.571131 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bt2sw"] Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.655087 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-catalog-content\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.655396 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-utilities\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.655487 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mpq5\" (UniqueName: \"kubernetes.io/projected/07fa793f-d0d3-469c-8110-cd6e594d40b9-kube-api-access-7mpq5\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.756921 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-utilities\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.756978 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mpq5\" (UniqueName: \"kubernetes.io/projected/07fa793f-d0d3-469c-8110-cd6e594d40b9-kube-api-access-7mpq5\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.757031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-catalog-content\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.757633 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-catalog-content\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.757651 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07fa793f-d0d3-469c-8110-cd6e594d40b9-utilities\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.776006 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mpq5\" (UniqueName: \"kubernetes.io/projected/07fa793f-d0d3-469c-8110-cd6e594d40b9-kube-api-access-7mpq5\") pod \"redhat-marketplace-bt2sw\" (UID: \"07fa793f-d0d3-469c-8110-cd6e594d40b9\") " pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:11 crc kubenswrapper[4693]: I1204 09:48:11.893932 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.335287 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bt2sw"] Dec 04 09:48:12 crc kubenswrapper[4693]: W1204 09:48:12.342088 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07fa793f_d0d3_469c_8110_cd6e594d40b9.slice/crio-f679290db93d19ff3aa39058ccf9d3605d58218ba7424aa3bdca1c0c7e889817 WatchSource:0}: Error finding container f679290db93d19ff3aa39058ccf9d3605d58218ba7424aa3bdca1c0c7e889817: Status 404 returned error can't find the container with id f679290db93d19ff3aa39058ccf9d3605d58218ba7424aa3bdca1c0c7e889817 Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.384386 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bt2sw" event={"ID":"07fa793f-d0d3-469c-8110-cd6e594d40b9","Type":"ContainerStarted","Data":"f679290db93d19ff3aa39058ccf9d3605d58218ba7424aa3bdca1c0c7e889817"} Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.562358 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zh65d"] Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.563869 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.565439 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.574222 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dclm\" (UniqueName: \"kubernetes.io/projected/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-kube-api-access-6dclm\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.574367 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-catalog-content\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.574501 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-utilities\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.575215 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh65d"] Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.676076 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dclm\" (UniqueName: \"kubernetes.io/projected/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-kube-api-access-6dclm\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.676146 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-catalog-content\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.676200 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-utilities\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.676717 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-utilities\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.676720 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-catalog-content\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.695660 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dclm\" (UniqueName: \"kubernetes.io/projected/8ec6de5e-a0e3-440b-be6e-336aef2a5a24-kube-api-access-6dclm\") pod \"redhat-operators-zh65d\" (UID: \"8ec6de5e-a0e3-440b-be6e-336aef2a5a24\") " pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:12 crc kubenswrapper[4693]: I1204 09:48:12.917881 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.361757 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh65d"] Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.390542 4693 generic.go:334] "Generic (PLEG): container finished" podID="96d24e68-be12-479a-8354-d30848a5b7e1" containerID="c57c0a5d8238b5e560516f6cf32934dd944b34d829845de667f868a3a7e91da9" exitCode=0 Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.390629 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xd4kh" event={"ID":"96d24e68-be12-479a-8354-d30848a5b7e1","Type":"ContainerDied","Data":"c57c0a5d8238b5e560516f6cf32934dd944b34d829845de667f868a3a7e91da9"} Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.392442 4693 generic.go:334] "Generic (PLEG): container finished" podID="07fa793f-d0d3-469c-8110-cd6e594d40b9" containerID="2d696eb12d1584f20987a903a0b6da4a2dabe4cf0d99d32ed5f6a4b2a3046f86" exitCode=0 Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.392496 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bt2sw" event={"ID":"07fa793f-d0d3-469c-8110-cd6e594d40b9","Type":"ContainerDied","Data":"2d696eb12d1584f20987a903a0b6da4a2dabe4cf0d99d32ed5f6a4b2a3046f86"} Dec 04 09:48:13 crc kubenswrapper[4693]: I1204 09:48:13.393527 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh65d" event={"ID":"8ec6de5e-a0e3-440b-be6e-336aef2a5a24","Type":"ContainerStarted","Data":"fbc47548acc10236ddd465744cb3894c35812578b07f4e7cfa62b3c20c7f752e"} Dec 04 09:48:14 crc kubenswrapper[4693]: I1204 09:48:14.398880 4693 generic.go:334] "Generic (PLEG): container finished" podID="8ec6de5e-a0e3-440b-be6e-336aef2a5a24" containerID="1bfc4451fbc090e863a94377008621c8079732b4b340b37e5fa6444431aed362" exitCode=0 Dec 04 09:48:14 crc kubenswrapper[4693]: I1204 09:48:14.399067 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh65d" event={"ID":"8ec6de5e-a0e3-440b-be6e-336aef2a5a24","Type":"ContainerDied","Data":"1bfc4451fbc090e863a94377008621c8079732b4b340b37e5fa6444431aed362"} Dec 04 09:48:15 crc kubenswrapper[4693]: I1204 09:48:15.406246 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xd4kh" event={"ID":"96d24e68-be12-479a-8354-d30848a5b7e1","Type":"ContainerStarted","Data":"98e6282f27842e3794765034ec81953920f1bf91a92c18ed94e747cd71695f2a"} Dec 04 09:48:15 crc kubenswrapper[4693]: I1204 09:48:15.408684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bt2sw" event={"ID":"07fa793f-d0d3-469c-8110-cd6e594d40b9","Type":"ContainerStarted","Data":"e3b032799802e9c8f4a1d4d734193f6d9f04ebad11d7ad8b5ba56790781b6bee"} Dec 04 09:48:15 crc kubenswrapper[4693]: I1204 09:48:15.419532 4693 generic.go:334] "Generic (PLEG): container finished" podID="12ac24d7-f1d6-48c9-95cb-ac54e898ad99" containerID="a8028694e20c27d25b7b5f22e239dcfc01e783e22808bb5777fae54274e730ed" exitCode=0 Dec 04 09:48:15 crc kubenswrapper[4693]: I1204 09:48:15.419605 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jb" event={"ID":"12ac24d7-f1d6-48c9-95cb-ac54e898ad99","Type":"ContainerDied","Data":"a8028694e20c27d25b7b5f22e239dcfc01e783e22808bb5777fae54274e730ed"} Dec 04 09:48:15 crc 
kubenswrapper[4693]: I1204 09:48:15.425632 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xd4kh" podStartSLOduration=1.845211777 podStartE2EDuration="6.425618864s" podCreationTimestamp="2025-12-04 09:48:09 +0000 UTC" firstStartedPulling="2025-12-04 09:48:10.360790528 +0000 UTC m=+336.258384281" lastFinishedPulling="2025-12-04 09:48:14.941197615 +0000 UTC m=+340.838791368" observedRunningTime="2025-12-04 09:48:15.421833178 +0000 UTC m=+341.319426931" watchObservedRunningTime="2025-12-04 09:48:15.425618864 +0000 UTC m=+341.323212607" Dec 04 09:48:15 crc kubenswrapper[4693]: I1204 09:48:15.428417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh65d" event={"ID":"8ec6de5e-a0e3-440b-be6e-336aef2a5a24","Type":"ContainerStarted","Data":"3887a82f19f5ea3e5ca40c8e03ca5f8b10a7209a79c3917c7173ec9ec70210f2"} Dec 04 09:48:16 crc kubenswrapper[4693]: I1204 09:48:16.435987 4693 generic.go:334] "Generic (PLEG): container finished" podID="8ec6de5e-a0e3-440b-be6e-336aef2a5a24" containerID="3887a82f19f5ea3e5ca40c8e03ca5f8b10a7209a79c3917c7173ec9ec70210f2" exitCode=0 Dec 04 09:48:16 crc kubenswrapper[4693]: I1204 09:48:16.436213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh65d" event={"ID":"8ec6de5e-a0e3-440b-be6e-336aef2a5a24","Type":"ContainerDied","Data":"3887a82f19f5ea3e5ca40c8e03ca5f8b10a7209a79c3917c7173ec9ec70210f2"} Dec 04 09:48:16 crc kubenswrapper[4693]: I1204 09:48:16.438175 4693 generic.go:334] "Generic (PLEG): container finished" podID="07fa793f-d0d3-469c-8110-cd6e594d40b9" containerID="e3b032799802e9c8f4a1d4d734193f6d9f04ebad11d7ad8b5ba56790781b6bee" exitCode=0 Dec 04 09:48:16 crc kubenswrapper[4693]: I1204 09:48:16.438196 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bt2sw" event={"ID":"07fa793f-d0d3-469c-8110-cd6e594d40b9","Type":"ContainerDied","Data":"e3b032799802e9c8f4a1d4d734193f6d9f04ebad11d7ad8b5ba56790781b6bee"} Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.445894 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bt2sw" event={"ID":"07fa793f-d0d3-469c-8110-cd6e594d40b9","Type":"ContainerStarted","Data":"d44d6fe91c1842eabe4ac96a75744fe7fabcb1f7e7cb1bff4df78da23cf0ea71"} Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.447854 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jb" event={"ID":"12ac24d7-f1d6-48c9-95cb-ac54e898ad99","Type":"ContainerStarted","Data":"cf8b0a2f3ef989c8980a7c0d3c265d12311493161aafe1d4a09e64fe0f42ecc5"} Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.451939 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh65d" event={"ID":"8ec6de5e-a0e3-440b-be6e-336aef2a5a24","Type":"ContainerStarted","Data":"a4665da2f7bc6afba2cfbeb03b6f819b9d2def01f10e7f14924f7c57ecc3f6bd"} Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.493126 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mm9jb" podStartSLOduration=3.048429426 podStartE2EDuration="7.493103604s" podCreationTimestamp="2025-12-04 09:48:10 +0000 UTC" firstStartedPulling="2025-12-04 09:48:11.368322191 +0000 UTC m=+337.265915944" lastFinishedPulling="2025-12-04 09:48:15.812996369 +0000 UTC m=+341.710590122" observedRunningTime="2025-12-04 
09:48:17.484535032 +0000 UTC m=+343.382128785" watchObservedRunningTime="2025-12-04 09:48:17.493103604 +0000 UTC m=+343.390697357" Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.494491 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bt2sw" podStartSLOduration=4.084965538 podStartE2EDuration="6.494480776s" podCreationTimestamp="2025-12-04 09:48:11 +0000 UTC" firstStartedPulling="2025-12-04 09:48:14.401221924 +0000 UTC m=+340.298815677" lastFinishedPulling="2025-12-04 09:48:16.810737152 +0000 UTC m=+342.708330915" observedRunningTime="2025-12-04 09:48:17.466705006 +0000 UTC m=+343.364298759" watchObservedRunningTime="2025-12-04 09:48:17.494480776 +0000 UTC m=+343.392074529" Dec 04 09:48:17 crc kubenswrapper[4693]: I1204 09:48:17.518428 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zh65d" podStartSLOduration=3.07946619 podStartE2EDuration="5.518408279s" podCreationTimestamp="2025-12-04 09:48:12 +0000 UTC" firstStartedPulling="2025-12-04 09:48:14.401764841 +0000 UTC m=+340.299358604" lastFinishedPulling="2025-12-04 09:48:16.84070694 +0000 UTC m=+342.738300693" observedRunningTime="2025-12-04 09:48:17.515590853 +0000 UTC m=+343.413184616" watchObservedRunningTime="2025-12-04 09:48:17.518408279 +0000 UTC m=+343.416002032" Dec 04 09:48:19 crc kubenswrapper[4693]: I1204 09:48:19.482920 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:19 crc kubenswrapper[4693]: I1204 09:48:19.483294 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:19 crc kubenswrapper[4693]: I1204 09:48:19.529482 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:20 crc kubenswrapper[4693]: I1204 09:48:20.483196 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:20 crc kubenswrapper[4693]: I1204 09:48:20.484407 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:20 crc kubenswrapper[4693]: I1204 09:48:20.517107 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xd4kh" Dec 04 09:48:20 crc kubenswrapper[4693]: I1204 09:48:20.527923 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:21 crc kubenswrapper[4693]: I1204 09:48:21.515307 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mm9jb" Dec 04 09:48:21 crc kubenswrapper[4693]: I1204 09:48:21.894684 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:21 crc kubenswrapper[4693]: I1204 09:48:21.894776 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:21 crc kubenswrapper[4693]: I1204 09:48:21.954921 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:22 crc kubenswrapper[4693]: I1204 09:48:22.523379 4693 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bt2sw" Dec 04 09:48:22 crc kubenswrapper[4693]: I1204 09:48:22.919075 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:22 crc kubenswrapper[4693]: I1204 09:48:22.919128 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:22 crc kubenswrapper[4693]: I1204 09:48:22.964725 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:23 crc kubenswrapper[4693]: I1204 09:48:23.531079 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zh65d" Dec 04 09:48:25 crc kubenswrapper[4693]: I1204 09:48:25.120594 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" podUID="3ce889a4-48b5-429d-8d0e-fc270a53385b" containerName="registry" containerID="cri-o://3087f3281fa40b31afa9506d0bb4b2fc3ece5f3180d1339d8fe43d648f58f9ec" gracePeriod=30 Dec 04 09:48:25 crc kubenswrapper[4693]: I1204 09:48:25.497427 4693 generic.go:334] "Generic (PLEG): container finished" podID="3ce889a4-48b5-429d-8d0e-fc270a53385b" containerID="3087f3281fa40b31afa9506d0bb4b2fc3ece5f3180d1339d8fe43d648f58f9ec" exitCode=0 Dec 04 09:48:25 crc kubenswrapper[4693]: I1204 09:48:25.497467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" event={"ID":"3ce889a4-48b5-429d-8d0e-fc270a53385b","Type":"ContainerDied","Data":"3087f3281fa40b31afa9506d0bb4b2fc3ece5f3180d1339d8fe43d648f58f9ec"} Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.417296 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.518959 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" event={"ID":"3ce889a4-48b5-429d-8d0e-fc270a53385b","Type":"ContainerDied","Data":"c800ca3f53b9611a85ce21d51b60eacc2b88b6526a6b6e1062acb2bf57f42427"} Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.519035 4693 scope.go:117] "RemoveContainer" containerID="3087f3281fa40b31afa9506d0bb4b2fc3ece5f3180d1339d8fe43d648f58f9ec" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.519153 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mnzz8" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.532912 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.532979 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533010 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533029 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533053 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59sj2\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533096 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533128 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533308 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3ce889a4-48b5-429d-8d0e-fc270a53385b\" (UID: \"3ce889a4-48b5-429d-8d0e-fc270a53385b\") " Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.533887 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.534390 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.538813 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.539055 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.539293 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2" (OuterVolumeSpecName: "kube-api-access-59sj2") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "kube-api-access-59sj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.541297 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.552633 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634483 4693 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634512 4693 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634523 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634531 4693 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ce889a4-48b5-429d-8d0e-fc270a53385b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634540 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59sj2\" (UniqueName: \"kubernetes.io/projected/3ce889a4-48b5-429d-8d0e-fc270a53385b-kube-api-access-59sj2\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634548 4693 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ce889a4-48b5-429d-8d0e-fc270a53385b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.634559 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ce889a4-48b5-429d-8d0e-fc270a53385b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.648229 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3ce889a4-48b5-429d-8d0e-fc270a53385b" (UID: "3ce889a4-48b5-429d-8d0e-fc270a53385b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.853171 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:48:29 crc kubenswrapper[4693]: I1204 09:48:29.858758 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mnzz8"] Dec 04 09:48:30 crc kubenswrapper[4693]: I1204 09:48:30.468600 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce889a4-48b5-429d-8d0e-fc270a53385b" path="/var/lib/kubelet/pods/3ce889a4-48b5-429d-8d0e-fc270a53385b/volumes" Dec 04 09:48:52 crc kubenswrapper[4693]: I1204 09:48:52.273523 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:48:52 crc kubenswrapper[4693]: I1204 09:48:52.274026 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:49:22 crc kubenswrapper[4693]: I1204 09:49:22.273319 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:49:22 crc kubenswrapper[4693]: I1204 09:49:22.274515 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:49:52 crc kubenswrapper[4693]: I1204 09:49:52.273598 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:49:52 crc kubenswrapper[4693]: I1204 09:49:52.275158 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:49:52 crc kubenswrapper[4693]: I1204 09:49:52.275280 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:49:52 crc kubenswrapper[4693]: I1204 09:49:52.275960 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 
09:49:52 crc kubenswrapper[4693]: I1204 09:49:52.276094 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d" gracePeriod=600 Dec 04 09:49:53 crc kubenswrapper[4693]: I1204 09:49:53.070696 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d" exitCode=0 Dec 04 09:49:53 crc kubenswrapper[4693]: I1204 09:49:53.070817 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d"} Dec 04 09:49:53 crc kubenswrapper[4693]: I1204 09:49:53.071053 4693 scope.go:117] "RemoveContainer" containerID="dc3eb2c670d736e622f5e389adf244a416496825b95513b2ffb4548e91e3ce8c" Dec 04 09:49:54 crc kubenswrapper[4693]: I1204 09:49:54.077468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7"} Dec 04 09:52:22 crc kubenswrapper[4693]: I1204 09:52:22.272614 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:52:22 crc kubenswrapper[4693]: I1204 09:52:22.273028 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:52:37 crc kubenswrapper[4693]: I1204 09:52:37.742713 4693 scope.go:117] "RemoveContainer" containerID="838d029d10876585be32f081854df97ff2cc484d18c24834321f465c3362a503" Dec 04 09:52:52 crc kubenswrapper[4693]: I1204 09:52:52.273527 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:52:52 crc kubenswrapper[4693]: I1204 09:52:52.274033 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:53:22 crc kubenswrapper[4693]: I1204 09:53:22.273297 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:53:22 crc 
kubenswrapper[4693]: I1204 09:53:22.273790 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:53:22 crc kubenswrapper[4693]: I1204 09:53:22.273835 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:53:22 crc kubenswrapper[4693]: I1204 09:53:22.274463 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:53:22 crc kubenswrapper[4693]: I1204 09:53:22.274524 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7" gracePeriod=600 Dec 04 09:53:22 crc kubenswrapper[4693]: E1204 09:53:22.403316 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f65408_7d18_47db_8a19_f9be435dd348.slice/crio-6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7.scope\": RecentStats: unable to find data in memory cache]" Dec 04 09:53:23 crc kubenswrapper[4693]: I1204 09:53:23.326505 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7" exitCode=0 Dec 04 09:53:23 crc kubenswrapper[4693]: I1204 09:53:23.326611 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7"} Dec 04 09:53:23 crc kubenswrapper[4693]: I1204 09:53:23.327011 4693 scope.go:117] "RemoveContainer" containerID="e5c3376928d63e92c6ec3a4e0e41e8231361da3b96d1164b5ed95a2e7d0a788d" Dec 04 09:53:24 crc kubenswrapper[4693]: I1204 09:53:24.334204 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89"} Dec 04 09:53:33 crc kubenswrapper[4693]: I1204 09:53:33.154556 4693 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.812796 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qn5sx"] Dec 04 09:54:27 crc kubenswrapper[4693]: E1204 09:54:27.814273 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce889a4-48b5-429d-8d0e-fc270a53385b" containerName="registry" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 
09:54:27.814299 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce889a4-48b5-429d-8d0e-fc270a53385b" containerName="registry" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.814470 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce889a4-48b5-429d-8d0e-fc270a53385b" containerName="registry" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.815140 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.815868 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qc5kn"] Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.816731 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qc5kn" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.817837 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.817862 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.818435 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sqqmh" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.818541 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jk5wf" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.826801 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qn5sx"] Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.834688 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qc5kn"] Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.843420 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zv9m"] Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.844134 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.850200 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jr442" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.859856 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zv9m"] Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.872581 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2572h\" (UniqueName: \"kubernetes.io/projected/e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce-kube-api-access-2572h\") pod \"cert-manager-5b446d88c5-qc5kn\" (UID: \"e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce\") " pod="cert-manager/cert-manager-5b446d88c5-qc5kn" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.872764 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csm4q\" (UniqueName: \"kubernetes.io/projected/5f2c58ea-f0fb-4460-9794-64d3182b3b5f-kube-api-access-csm4q\") pod \"cert-manager-cainjector-7f985d654d-qn5sx\" (UID: \"5f2c58ea-f0fb-4460-9794-64d3182b3b5f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.974859 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrsq7\" (UniqueName: \"kubernetes.io/projected/df9e8e44-fcb6-48e4-abc7-cb16efbb64bd-kube-api-access-rrsq7\") pod \"cert-manager-webhook-5655c58dd6-9zv9m\" (UID: \"df9e8e44-fcb6-48e4-abc7-cb16efbb64bd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.974989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2572h\" (UniqueName: \"kubernetes.io/projected/e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce-kube-api-access-2572h\") pod \"cert-manager-5b446d88c5-qc5kn\" (UID: \"e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce\") " pod="cert-manager/cert-manager-5b446d88c5-qc5kn" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.975081 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csm4q\" (UniqueName: \"kubernetes.io/projected/5f2c58ea-f0fb-4460-9794-64d3182b3b5f-kube-api-access-csm4q\") pod \"cert-manager-cainjector-7f985d654d-qn5sx\" (UID: \"5f2c58ea-f0fb-4460-9794-64d3182b3b5f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.997433 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csm4q\" (UniqueName: \"kubernetes.io/projected/5f2c58ea-f0fb-4460-9794-64d3182b3b5f-kube-api-access-csm4q\") pod \"cert-manager-cainjector-7f985d654d-qn5sx\" (UID: \"5f2c58ea-f0fb-4460-9794-64d3182b3b5f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" Dec 04 09:54:27 crc kubenswrapper[4693]: I1204 09:54:27.997495 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2572h\" (UniqueName: \"kubernetes.io/projected/e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce-kube-api-access-2572h\") pod \"cert-manager-5b446d88c5-qc5kn\" (UID: \"e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce\") " pod="cert-manager/cert-manager-5b446d88c5-qc5kn" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.076937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rrsq7\" (UniqueName: \"kubernetes.io/projected/df9e8e44-fcb6-48e4-abc7-cb16efbb64bd-kube-api-access-rrsq7\") pod \"cert-manager-webhook-5655c58dd6-9zv9m\" (UID: \"df9e8e44-fcb6-48e4-abc7-cb16efbb64bd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.094838 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrsq7\" (UniqueName: \"kubernetes.io/projected/df9e8e44-fcb6-48e4-abc7-cb16efbb64bd-kube-api-access-rrsq7\") pod \"cert-manager-webhook-5655c58dd6-9zv9m\" (UID: \"df9e8e44-fcb6-48e4-abc7-cb16efbb64bd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.145485 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.161121 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qc5kn" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.178407 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.380323 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qn5sx"] Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.392550 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.606980 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qc5kn"] Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.613479 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zv9m"] Dec 04 09:54:28 crc kubenswrapper[4693]: W1204 09:54:28.615600 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fb930d_7df9_4d8f_8edf_e5ae3ef734ce.slice/crio-98e59848a1745ca890645f565ed0f8db0831ac92cbd6a8fdf99b7ada1557b5d8 WatchSource:0}: Error finding container 98e59848a1745ca890645f565ed0f8db0831ac92cbd6a8fdf99b7ada1557b5d8: Status 404 returned error can't find the container with id 98e59848a1745ca890645f565ed0f8db0831ac92cbd6a8fdf99b7ada1557b5d8 Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.683776 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" event={"ID":"5f2c58ea-f0fb-4460-9794-64d3182b3b5f","Type":"ContainerStarted","Data":"500be5dc30a5ecb2e591cc8325ea13006c5d107ddf08cf8807f894becfef5f86"} Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.684743 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qc5kn" event={"ID":"e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce","Type":"ContainerStarted","Data":"98e59848a1745ca890645f565ed0f8db0831ac92cbd6a8fdf99b7ada1557b5d8"} Dec 04 09:54:28 crc kubenswrapper[4693]: I1204 09:54:28.685465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" event={"ID":"df9e8e44-fcb6-48e4-abc7-cb16efbb64bd","Type":"ContainerStarted","Data":"d6856c13b09578095103f29b1dfd578cd219f1d8f2c8f76d9c332bf90765be18"} Dec 04 09:54:31 crc kubenswrapper[4693]: I1204 
09:54:31.704944 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" event={"ID":"df9e8e44-fcb6-48e4-abc7-cb16efbb64bd","Type":"ContainerStarted","Data":"8ffec5e5ecd99546f3b03fb2a925dfe1bac21a958d6f405cf3e2eee8d33a276a"} Dec 04 09:54:31 crc kubenswrapper[4693]: I1204 09:54:31.705303 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:31 crc kubenswrapper[4693]: I1204 09:54:31.706273 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" event={"ID":"5f2c58ea-f0fb-4460-9794-64d3182b3b5f","Type":"ContainerStarted","Data":"3c6a78a72be6f440166e4b75182c999184b2231b57e3917db86a88b34761b671"} Dec 04 09:54:31 crc kubenswrapper[4693]: I1204 09:54:31.719073 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" podStartSLOduration=2.502465393 podStartE2EDuration="4.719057819s" podCreationTimestamp="2025-12-04 09:54:27 +0000 UTC" firstStartedPulling="2025-12-04 09:54:28.622158501 +0000 UTC m=+714.519752264" lastFinishedPulling="2025-12-04 09:54:30.838750927 +0000 UTC m=+716.736344690" observedRunningTime="2025-12-04 09:54:31.717071671 +0000 UTC m=+717.614665434" watchObservedRunningTime="2025-12-04 09:54:31.719057819 +0000 UTC m=+717.616651572" Dec 04 09:54:31 crc kubenswrapper[4693]: I1204 09:54:31.738800 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qn5sx" podStartSLOduration=2.345194373 podStartE2EDuration="4.738759805s" podCreationTimestamp="2025-12-04 09:54:27 +0000 UTC" firstStartedPulling="2025-12-04 09:54:28.392187602 +0000 UTC m=+714.289781355" lastFinishedPulling="2025-12-04 09:54:30.785753024 +0000 UTC m=+716.683346787" observedRunningTime="2025-12-04 09:54:31.73644622 +0000 UTC m=+717.634039993" watchObservedRunningTime="2025-12-04 09:54:31.738759805 +0000 UTC m=+717.636353558" Dec 04 09:54:32 crc kubenswrapper[4693]: I1204 09:54:32.713359 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qc5kn" event={"ID":"e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce","Type":"ContainerStarted","Data":"6ced852d559033f2ee917ab9968fc67eb692654a5c70e4767ed859544bd01d08"} Dec 04 09:54:32 crc kubenswrapper[4693]: I1204 09:54:32.737161 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qc5kn" podStartSLOduration=2.461983704 podStartE2EDuration="5.737130618s" podCreationTimestamp="2025-12-04 09:54:27 +0000 UTC" firstStartedPulling="2025-12-04 09:54:28.617598863 +0000 UTC m=+714.515192616" lastFinishedPulling="2025-12-04 09:54:31.892745777 +0000 UTC m=+717.790339530" observedRunningTime="2025-12-04 09:54:32.734005175 +0000 UTC m=+718.631598928" watchObservedRunningTime="2025-12-04 09:54:32.737130618 +0000 UTC m=+718.634724371" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.181476 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zv9m" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.398943 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wm5mt"] Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399404 4693 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-controller" containerID="cri-o://c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399487 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="nbdb" containerID="cri-o://84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399532 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="northd" containerID="cri-o://2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399584 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399611 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="sbdb" containerID="cri-o://9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399625 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-node" containerID="cri-o://9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.399667 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-acl-logging" containerID="cri-o://1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.425474 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovnkube-controller" containerID="cri-o://d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" gracePeriod=30 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.690170 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wm5mt_d6e969b8-31f1-4fbf-9597-16349612e0c0/ovn-acl-logging/0.log" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.690886 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wm5mt_d6e969b8-31f1-4fbf-9597-16349612e0c0/ovn-controller/0.log" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.691370 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713682 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713732 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713760 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713797 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4tdx\" (UniqueName: \"kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713824 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713852 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713878 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713906 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713929 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713961 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" 
(UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713989 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714016 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714038 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714058 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714088 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714107 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714126 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714174 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714207 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714230 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes\") pod \"d6e969b8-31f1-4fbf-9597-16349612e0c0\" (UID: \"d6e969b8-31f1-4fbf-9597-16349612e0c0\") " Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713836 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713917 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket" (OuterVolumeSpecName: "log-socket") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.713983 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714449 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714569 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714588 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714605 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714608 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714658 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714687 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714711 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714734 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash" (OuterVolumeSpecName: "host-slash") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714756 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714839 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.714869 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log" (OuterVolumeSpecName: "node-log") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.715070 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.719873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx" (OuterVolumeSpecName: "kube-api-access-x4tdx") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "kube-api-access-x4tdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.720909 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.731013 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d6e969b8-31f1-4fbf-9597-16349612e0c0" (UID: "d6e969b8-31f1-4fbf-9597-16349612e0c0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.743187 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8h228"] Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.743799 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-acl-logging" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.743881 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-acl-logging" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.743954 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="northd" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.744024 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="northd" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.744117 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.744192 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.744267 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovnkube-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.744381 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovnkube-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.744453 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="sbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745006 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="sbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.745070 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745121 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.745173 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-node" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745219 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-node" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.745271 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="nbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745322 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="nbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.745476 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" 
containerName="kubecfg-setup" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745571 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kubecfg-setup" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745751 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745827 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-acl-logging" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745892 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="kube-rbac-proxy-node" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.745961 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="northd" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.746031 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="nbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.746094 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovnkube-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.746159 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="sbdb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.746214 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerName="ovn-controller" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.747941 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.752547 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wm5mt_d6e969b8-31f1-4fbf-9597-16349612e0c0/ovn-acl-logging/0.log" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753431 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wm5mt_d6e969b8-31f1-4fbf-9597-16349612e0c0/ovn-controller/0.log" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753826 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753864 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753873 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753881 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753891 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753900 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" exitCode=0 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753909 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" exitCode=143 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753917 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6e969b8-31f1-4fbf-9597-16349612e0c0" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" exitCode=143 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753945 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753982 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.753994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} Dec 04 09:54:38 
crc kubenswrapper[4693]: I1204 09:54:38.754005 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754023 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754065 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754191 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754205 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754218 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754226 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754236 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754247 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754256 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754263 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754271 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754278 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754287 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754294 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754301 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754308 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754320 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754349 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754360 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754367 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754374 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754380 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754387 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754393 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754400 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754408 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754418 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wm5mt" event={"ID":"d6e969b8-31f1-4fbf-9597-16349612e0c0","Type":"ContainerDied","Data":"704672773ab4756574dd3547af6d35bee3707bdfc1c172c7674bcd936e9beaf8"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754430 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754438 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754444 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754450 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754454 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754459 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754465 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754469 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.754475 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.755749 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zb44s_2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4/kube-multus/0.log" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.755774 4693 generic.go:334] "Generic (PLEG): container finished" podID="2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4" containerID="0e15b6f8f7d5c95ccd143a1d4a5f1a03a398802a52b93e7061c9c4d7e0318cc0" exitCode=2 Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.755790 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zb44s" event={"ID":"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4","Type":"ContainerDied","Data":"0e15b6f8f7d5c95ccd143a1d4a5f1a03a398802a52b93e7061c9c4d7e0318cc0"} Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.756223 4693 scope.go:117] "RemoveContainer" 
containerID="0e15b6f8f7d5c95ccd143a1d4a5f1a03a398802a52b93e7061c9c4d7e0318cc0" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.786608 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.808628 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815230 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-var-lib-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815269 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815292 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-kubelet\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815308 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-log-socket\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815322 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xkh\" (UniqueName: \"kubernetes.io/projected/bffdd9a9-d990-4395-859a-5b2d08d15964-kube-api-access-82xkh\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815349 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wm5mt"] Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815354 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-config\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815438 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-slash\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815468 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-systemd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815495 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815510 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-env-overrides\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-netd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815589 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-node-log\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815697 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-bin\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815741 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-systemd-units\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815756 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-netns\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-etc-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815794 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-script-lib\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815849 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffdd9a9-d990-4395-859a-5b2d08d15964-ovn-node-metrics-cert\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815896 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-ovn\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815963 4693 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815980 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815990 4693 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.815998 4693 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816006 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816014 4693 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816023 4693 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-var-lib-cni-networks-ovn-kubernetes\") 
on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816031 4693 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816042 4693 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816052 4693 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816060 4693 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816069 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816077 4693 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816085 4693 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816095 4693 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816103 4693 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816110 4693 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816118 4693 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6e969b8-31f1-4fbf-9597-16349612e0c0-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816127 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4tdx\" (UniqueName: \"kubernetes.io/projected/d6e969b8-31f1-4fbf-9597-16349612e0c0-kube-api-access-x4tdx\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.816136 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6e969b8-31f1-4fbf-9597-16349612e0c0-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:38 crc 
kubenswrapper[4693]: I1204 09:54:38.820773 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wm5mt"] Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.822982 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.839717 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.863547 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.876465 4693 scope.go:117] "RemoveContainer" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.889913 4693 scope.go:117] "RemoveContainer" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.905538 4693 scope.go:117] "RemoveContainer" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916775 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-script-lib\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916818 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffdd9a9-d990-4395-859a-5b2d08d15964-ovn-node-metrics-cert\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916858 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-ovn\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916882 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-var-lib-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916901 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-kubelet\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916917 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 
09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916933 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-log-socket\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916950 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xkh\" (UniqueName: \"kubernetes.io/projected/bffdd9a9-d990-4395-859a-5b2d08d15964-kube-api-access-82xkh\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916965 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-config\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.916982 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-slash\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917001 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-systemd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917018 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917032 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-env-overrides\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917047 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-netd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917063 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 
09:54:38.917085 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-node-log\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917100 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-bin\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917119 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-systemd-units\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917133 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-netns\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917150 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-etc-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917212 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-etc-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917450 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917458 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-systemd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917493 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-ovn\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917510 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-log-socket\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917545 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-bin\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917550 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-slash\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917572 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-var-lib-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917590 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-node-log\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917586 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-systemd-units\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917598 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-netns\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917644 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917608 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-kubelet\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917641 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-host-cni-netd\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.917613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffdd9a9-d990-4395-859a-5b2d08d15964-run-openvswitch\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.918037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-env-overrides\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.918181 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-config\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.918874 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.919202 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.919232 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} err="failed to get container status \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.919255 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.919701 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting with 9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.919752 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} err="failed to get container status \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": rpc error: code = NotFound desc = could not find container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting 
with 9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.919785 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.919970 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffdd9a9-d990-4395-859a-5b2d08d15964-ovnkube-script-lib\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.920393 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.920437 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} err="failed to get container status \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.920462 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.920969 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.921015 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} err="failed to get container status \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.921049 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.921737 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" 
containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.921763 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} err="failed to get container status \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.921778 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.921973 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922001 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} err="failed to get container status \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922018 4693 scope.go:117] "RemoveContainer" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922192 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffdd9a9-d990-4395-859a-5b2d08d15964-ovn-node-metrics-cert\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.922196 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": container with ID starting with 1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb not found: ID does not exist" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922247 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} err="failed to get container status \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": rpc error: code = NotFound desc = could not find container \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": container with ID starting with 1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb not found: ID does not exist" Dec 04 09:54:38 crc 
kubenswrapper[4693]: I1204 09:54:38.922271 4693 scope.go:117] "RemoveContainer" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.922512 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": container with ID starting with c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1 not found: ID does not exist" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922538 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} err="failed to get container status \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": rpc error: code = NotFound desc = could not find container \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": container with ID starting with c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922553 4693 scope.go:117] "RemoveContainer" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: E1204 09:54:38.922708 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": container with ID starting with d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d not found: ID does not exist" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922728 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} err="failed to get container status \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": rpc error: code = NotFound desc = could not find container \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": container with ID starting with d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922745 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922880 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} err="failed to get container status \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.922904 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923042 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} err="failed to get container status \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": rpc error: code = NotFound desc = could not find container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting with 9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923062 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923195 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} err="failed to get container status \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923216 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923366 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} err="failed to get container status \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923418 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923589 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} err="failed to get container status \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923612 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923809 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} err="failed to get container status \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" Dec 
04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.923831 4693 scope.go:117] "RemoveContainer" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924018 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} err="failed to get container status \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": rpc error: code = NotFound desc = could not find container \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": container with ID starting with 1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924048 4693 scope.go:117] "RemoveContainer" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924233 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} err="failed to get container status \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": rpc error: code = NotFound desc = could not find container \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": container with ID starting with c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924251 4693 scope.go:117] "RemoveContainer" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924414 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} err="failed to get container status \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": rpc error: code = NotFound desc = could not find container \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": container with ID starting with d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924434 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924578 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} err="failed to get container status \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924599 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924763 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} err="failed to get container status 
\"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": rpc error: code = NotFound desc = could not find container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting with 9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924780 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.924952 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} err="failed to get container status \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925030 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925177 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} err="failed to get container status \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925201 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925396 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} err="failed to get container status \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925428 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925601 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} err="failed to get container status \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925622 4693 scope.go:117] "RemoveContainer" 
containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925762 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} err="failed to get container status \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": rpc error: code = NotFound desc = could not find container \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": container with ID starting with 1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925779 4693 scope.go:117] "RemoveContainer" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925920 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} err="failed to get container status \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": rpc error: code = NotFound desc = could not find container \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": container with ID starting with c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.925941 4693 scope.go:117] "RemoveContainer" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926084 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} err="failed to get container status \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": rpc error: code = NotFound desc = could not find container \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": container with ID starting with d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926103 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926246 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} err="failed to get container status \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926267 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926434 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} err="failed to get container status \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": rpc error: code = NotFound desc = could not find 
container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting with 9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926454 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926589 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} err="failed to get container status \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926611 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926749 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} err="failed to get container status \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926767 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926894 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} err="failed to get container status \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.926912 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927036 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} err="failed to get container status \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927053 4693 scope.go:117] "RemoveContainer" containerID="1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927211 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb"} err="failed to get container status \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": rpc error: code = NotFound desc = could not find container \"1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb\": container with ID starting with 1ce046eed05f16c89f74a54931b121f02f824140c2a7a5edf78a9030b675fabb not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927233 4693 scope.go:117] "RemoveContainer" containerID="c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927411 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1"} err="failed to get container status \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": rpc error: code = NotFound desc = could not find container \"c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1\": container with ID starting with c7c90bb1fdcab9d71dceea202d2f1a132d27299194f089e986290727033944e1 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927429 4693 scope.go:117] "RemoveContainer" containerID="d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927571 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d"} err="failed to get container status \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": rpc error: code = NotFound desc = could not find container \"d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d\": container with ID starting with d096d796ff5b77a28b4150189463a9eeeea39d17f1306bf950814a3b15ffc23d not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927587 4693 scope.go:117] "RemoveContainer" containerID="d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927726 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2"} err="failed to get container status \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": rpc error: code = NotFound desc = could not find container \"d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2\": container with ID starting with d32061acd9bbabc041549824b2d1d598fd143fd37257738cea530a72bbf052b2 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.927744 4693 scope.go:117] "RemoveContainer" containerID="9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.928523 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf"} err="failed to get container status \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": rpc error: code = NotFound desc = could not find container \"9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf\": container with ID starting with 
9a81001e54bd13b754d479047f4b1c0a7188e37488130227fa6c023e8a8d82cf not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.928551 4693 scope.go:117] "RemoveContainer" containerID="84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.928734 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6"} err="failed to get container status \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": rpc error: code = NotFound desc = could not find container \"84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6\": container with ID starting with 84d8815e70154060cfbcba9b2199d0e850c819150ffd90b38990a1a83a53dcb6 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.928756 4693 scope.go:117] "RemoveContainer" containerID="2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.929348 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005"} err="failed to get container status \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": rpc error: code = NotFound desc = could not find container \"2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005\": container with ID starting with 2711fa4c58342e0b4f1f68efcb139e0bee15059ded23f076071c9d316df68005 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.929368 4693 scope.go:117] "RemoveContainer" containerID="036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.929529 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c"} err="failed to get container status \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": rpc error: code = NotFound desc = could not find container \"036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c\": container with ID starting with 036177531dac19e9e955767a98855cee2d4ddf7e38f2353a234131bd91b2ec9c not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.929547 4693 scope.go:117] "RemoveContainer" containerID="9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.929690 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82"} err="failed to get container status \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": rpc error: code = NotFound desc = could not find container \"9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82\": container with ID starting with 9073d0407418e2a8e130fef2b81eee5b2f17a3218c34fe9aa93f7038f8596f82 not found: ID does not exist" Dec 04 09:54:38 crc kubenswrapper[4693]: I1204 09:54:38.935813 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xkh\" (UniqueName: \"kubernetes.io/projected/bffdd9a9-d990-4395-859a-5b2d08d15964-kube-api-access-82xkh\") pod \"ovnkube-node-8h228\" (UID: \"bffdd9a9-d990-4395-859a-5b2d08d15964\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.061308 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:39 crc kubenswrapper[4693]: W1204 09:54:39.077534 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffdd9a9_d990_4395_859a_5b2d08d15964.slice/crio-09345184a2890cf3468cd3c90b78390016a0d1f1ad76474d077a41ebd2d8d2fd WatchSource:0}: Error finding container 09345184a2890cf3468cd3c90b78390016a0d1f1ad76474d077a41ebd2d8d2fd: Status 404 returned error can't find the container with id 09345184a2890cf3468cd3c90b78390016a0d1f1ad76474d077a41ebd2d8d2fd Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.764376 4693 generic.go:334] "Generic (PLEG): container finished" podID="bffdd9a9-d990-4395-859a-5b2d08d15964" containerID="654d71f2b6c611ddd1e80350ee0d90c52416c2969de832aaa9a77b56598c3c04" exitCode=0 Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.764477 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerDied","Data":"654d71f2b6c611ddd1e80350ee0d90c52416c2969de832aaa9a77b56598c3c04"} Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.764744 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"09345184a2890cf3468cd3c90b78390016a0d1f1ad76474d077a41ebd2d8d2fd"} Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.772590 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zb44s_2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4/kube-multus/0.log" Dec 04 09:54:39 crc kubenswrapper[4693]: I1204 09:54:39.772657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zb44s" event={"ID":"2a56f61e-5b1a-4a14-9cc6-3c9e6eb78be4","Type":"ContainerStarted","Data":"de422f73349f584942c651f854d64bf96ad73a43bdda1e39af33d6811cc275ea"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.468124 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e969b8-31f1-4fbf-9597-16349612e0c0" path="/var/lib/kubelet/pods/d6e969b8-31f1-4fbf-9597-16349612e0c0/volumes" Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"410508caf408ebf070083f41c92bbc337d14e67ba99e4bb1f2662fbf6972ccaf"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784629 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"643ea80725eaeb44bbba4e14bcc4b19d39ff7873f9edc139741dc6bba813edb6"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784644 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"fcdcf5c37b9cee8c3d7127fd99634de3b30939acc7725f5ef2583eb733aeb77d"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784655 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"917e4043f4c56dc0182306d0485b55ea2ad5888b03357f081a34aa3da518c9ad"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"9ea418e9dbe59637102354bfe93f4441a2e48fe9c95a577541fe8bb709e93ce9"} Dec 04 09:54:40 crc kubenswrapper[4693]: I1204 09:54:40.784681 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"c027507aaaa5f7adac73596d99881f8b344e99c10e88dcf24f758f2301df80b1"} Dec 04 09:54:43 crc kubenswrapper[4693]: I1204 09:54:43.809133 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"e8423da59a58cc3f58b1f26a3bada22b3201b5a0870f0f484e5159568d471d48"} Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.834290 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" event={"ID":"bffdd9a9-d990-4395-859a-5b2d08d15964","Type":"ContainerStarted","Data":"c2e12db73748e7e45b1be13cb101792a2d499ca1ca2106400931944d59be9185"} Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.834911 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.834934 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.834950 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.869216 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.870392 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:54:46 crc kubenswrapper[4693]: I1204 09:54:46.875600 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" podStartSLOduration=8.875583631 podStartE2EDuration="8.875583631s" podCreationTimestamp="2025-12-04 09:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:54:46.874074985 +0000 UTC m=+732.771668738" watchObservedRunningTime="2025-12-04 09:54:46.875583631 +0000 UTC m=+732.773177384" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.230178 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.231351 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.233038 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.233320 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.233491 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5vl77" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.320768 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-run\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.320814 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-log\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.320843 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8kp\" (UniqueName: \"kubernetes.io/projected/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-kube-api-access-6l8kp\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.320869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-data\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.421726 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8kp\" (UniqueName: \"kubernetes.io/projected/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-kube-api-access-6l8kp\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.421798 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-data\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.421897 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-run\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.421926 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-log\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.422464 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-log\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " 
pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.422496 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-data\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.422479 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-run\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.443946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8kp\" (UniqueName: \"kubernetes.io/projected/5f0438dc-0fdb-48e2-a807-3292d8bb3fed-kube-api-access-6l8kp\") pod \"ceph\" (UID: \"5f0438dc-0fdb-48e2-a807-3292d8bb3fed\") " pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.544873 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Dec 04 09:55:02 crc kubenswrapper[4693]: W1204 09:55:02.564289 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f0438dc_0fdb_48e2_a807_3292d8bb3fed.slice/crio-971a5c04e4f5a0d98a7e200c1dffab475b60c55a1a3bfded33f621494fc8a075 WatchSource:0}: Error finding container 971a5c04e4f5a0d98a7e200c1dffab475b60c55a1a3bfded33f621494fc8a075: Status 404 returned error can't find the container with id 971a5c04e4f5a0d98a7e200c1dffab475b60c55a1a3bfded33f621494fc8a075 Dec 04 09:55:02 crc kubenswrapper[4693]: I1204 09:55:02.930988 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"5f0438dc-0fdb-48e2-a807-3292d8bb3fed","Type":"ContainerStarted","Data":"971a5c04e4f5a0d98a7e200c1dffab475b60c55a1a3bfded33f621494fc8a075"} Dec 04 09:55:09 crc kubenswrapper[4693]: I1204 09:55:09.138937 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h228" Dec 04 09:55:20 crc kubenswrapper[4693]: I1204 09:55:20.046793 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"5f0438dc-0fdb-48e2-a807-3292d8bb3fed","Type":"ContainerStarted","Data":"c1cf876d1b85be0920d1f5ea8f0216a6101f23f1eef93b56c74c2fddf2014960"} Dec 04 09:55:20 crc kubenswrapper[4693]: I1204 09:55:20.072462 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.796369673 podStartE2EDuration="18.072438526s" podCreationTimestamp="2025-12-04 09:55:02 +0000 UTC" firstStartedPulling="2025-12-04 09:55:02.567055187 +0000 UTC m=+748.464648940" lastFinishedPulling="2025-12-04 09:55:18.84312403 +0000 UTC m=+764.740717793" observedRunningTime="2025-12-04 09:55:20.069131888 +0000 UTC m=+765.966725641" watchObservedRunningTime="2025-12-04 09:55:20.072438526 +0000 UTC m=+765.970032309" Dec 04 09:55:52 crc kubenswrapper[4693]: I1204 09:55:52.272963 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:55:52 crc kubenswrapper[4693]: I1204 09:55:52.273576 4693 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:56:22 crc kubenswrapper[4693]: I1204 09:56:22.273029 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:56:22 crc kubenswrapper[4693]: I1204 09:56:22.273590 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.433313 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.435199 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.474152 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.579680 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.579732 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.579767 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv86w\" (UniqueName: \"kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.681092 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.681147 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 
09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.681177 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv86w\" (UniqueName: \"kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.681826 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.681930 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.703634 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv86w\" (UniqueName: \"kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w\") pod \"redhat-operators-64bk5\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:33 crc kubenswrapper[4693]: I1204 09:56:33.754151 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:34 crc kubenswrapper[4693]: I1204 09:56:34.143566 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:34 crc kubenswrapper[4693]: I1204 09:56:34.480896 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerStarted","Data":"edeaddb7aa1170a5571726b2bd644a7bcadfb8e0bf07ea3003b1c6d9e0d212e3"} Dec 04 09:56:35 crc kubenswrapper[4693]: I1204 09:56:35.490848 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerID="0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e" exitCode=0 Dec 04 09:56:35 crc kubenswrapper[4693]: I1204 09:56:35.491010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerDied","Data":"0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e"} Dec 04 09:56:36 crc kubenswrapper[4693]: I1204 09:56:36.499763 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerStarted","Data":"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2"} Dec 04 09:56:37 crc kubenswrapper[4693]: I1204 09:56:37.515426 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerID="0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2" exitCode=0 Dec 04 09:56:37 crc kubenswrapper[4693]: I1204 09:56:37.515521 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" 
event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerDied","Data":"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2"} Dec 04 09:56:38 crc kubenswrapper[4693]: I1204 09:56:38.527210 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerStarted","Data":"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee"} Dec 04 09:56:38 crc kubenswrapper[4693]: I1204 09:56:38.542226 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-64bk5" podStartSLOduration=2.9125227799999998 podStartE2EDuration="5.542211176s" podCreationTimestamp="2025-12-04 09:56:33 +0000 UTC" firstStartedPulling="2025-12-04 09:56:35.496285887 +0000 UTC m=+841.393879680" lastFinishedPulling="2025-12-04 09:56:38.125974323 +0000 UTC m=+844.023568076" observedRunningTime="2025-12-04 09:56:38.542108333 +0000 UTC m=+844.439702086" watchObservedRunningTime="2025-12-04 09:56:38.542211176 +0000 UTC m=+844.439804929" Dec 04 09:56:43 crc kubenswrapper[4693]: I1204 09:56:43.754580 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:43 crc kubenswrapper[4693]: I1204 09:56:43.755314 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:44 crc kubenswrapper[4693]: I1204 09:56:44.811001 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64bk5" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="registry-server" probeResult="failure" output=< Dec 04 09:56:44 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 09:56:44 crc kubenswrapper[4693]: > Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.820954 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf"] Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.822436 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.824380 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.837321 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf"] Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.941201 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.941268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4l59\" (UniqueName: \"kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:45 crc kubenswrapper[4693]: I1204 09:56:45.941419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.042619 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.042717 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4l59\" (UniqueName: \"kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.042814 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.043620 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.043449 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.065844 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4l59\" (UniqueName: \"kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.139600 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:46 crc kubenswrapper[4693]: I1204 09:56:46.559825 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf"] Dec 04 09:56:46 crc kubenswrapper[4693]: W1204 09:56:46.570207 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ff9719_16f5_410e_af26_8dadc807ca7e.slice/crio-ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9 WatchSource:0}: Error finding container ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9: Status 404 returned error can't find the container with id ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9 Dec 04 09:56:47 crc kubenswrapper[4693]: I1204 09:56:47.572713 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerStarted","Data":"33bc0bc8c427bbf26b4721f620bd2d927f5544e25e04a2c931863f92c3814aa8"} Dec 04 09:56:47 crc kubenswrapper[4693]: I1204 09:56:47.573111 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerStarted","Data":"ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9"} Dec 04 09:56:49 crc kubenswrapper[4693]: I1204 09:56:49.587491 4693 generic.go:334] "Generic (PLEG): container finished" podID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerID="33bc0bc8c427bbf26b4721f620bd2d927f5544e25e04a2c931863f92c3814aa8" exitCode=0 Dec 04 09:56:49 crc kubenswrapper[4693]: I1204 09:56:49.587535 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerDied","Data":"33bc0bc8c427bbf26b4721f620bd2d927f5544e25e04a2c931863f92c3814aa8"} Dec 04 09:56:51 crc 
kubenswrapper[4693]: I1204 09:56:51.600890 4693 generic.go:334] "Generic (PLEG): container finished" podID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerID="06e2672046b00f3ee7155634c11909b70123a56ede98bd856fd46c40f82365f2" exitCode=0 Dec 04 09:56:51 crc kubenswrapper[4693]: I1204 09:56:51.601541 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerDied","Data":"06e2672046b00f3ee7155634c11909b70123a56ede98bd856fd46c40f82365f2"} Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.273546 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.273645 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.273717 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.274724 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.274873 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89" gracePeriod=600 Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.611029 4693 generic.go:334] "Generic (PLEG): container finished" podID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerID="f2abff4a409ba4f1b70c3fa3a60bde9fa4825bdc200f15a05f6358a3b1ad29ac" exitCode=0 Dec 04 09:56:52 crc kubenswrapper[4693]: I1204 09:56:52.611080 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerDied","Data":"f2abff4a409ba4f1b70c3fa3a60bde9fa4825bdc200f15a05f6358a3b1ad29ac"} Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.622622 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89" exitCode=0 Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.622675 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" 
event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89"} Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.623032 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a"} Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.623059 4693 scope.go:117] "RemoveContainer" containerID="6ce848f5ee271fc4a0487df8df9f65b5687a848bd213275ce74ef26bcb2100f7" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.787725 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.789940 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.808214 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.843149 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.894407 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.945051 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.965818 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.965866 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ww7\" (UniqueName: \"kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:53 crc kubenswrapper[4693]: I1204 09:56:53.965886 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067250 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4l59\" (UniqueName: \"kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59\") pod \"e2ff9719-16f5-410e-af26-8dadc807ca7e\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067309 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle\") pod \"e2ff9719-16f5-410e-af26-8dadc807ca7e\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067351 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util\") pod \"e2ff9719-16f5-410e-af26-8dadc807ca7e\" (UID: \"e2ff9719-16f5-410e-af26-8dadc807ca7e\") " Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067557 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067587 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067637 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ww7\" (UniqueName: \"kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.067928 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle" (OuterVolumeSpecName: "bundle") pod "e2ff9719-16f5-410e-af26-8dadc807ca7e" (UID: "e2ff9719-16f5-410e-af26-8dadc807ca7e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.068432 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.068572 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.081609 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59" (OuterVolumeSpecName: "kube-api-access-n4l59") pod "e2ff9719-16f5-410e-af26-8dadc807ca7e" (UID: "e2ff9719-16f5-410e-af26-8dadc807ca7e"). InnerVolumeSpecName "kube-api-access-n4l59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.082167 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util" (OuterVolumeSpecName: "util") pod "e2ff9719-16f5-410e-af26-8dadc807ca7e" (UID: "e2ff9719-16f5-410e-af26-8dadc807ca7e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.087319 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ww7\" (UniqueName: \"kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7\") pod \"certified-operators-m6xfz\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.141115 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.169092 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4l59\" (UniqueName: \"kubernetes.io/projected/e2ff9719-16f5-410e-af26-8dadc807ca7e-kube-api-access-n4l59\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.169550 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.169563 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2ff9719-16f5-410e-af26-8dadc807ca7e-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.373431 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.568165 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:56:54 crc kubenswrapper[4693]: W1204 09:56:54.571072 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740e10f0_dfaf_4992_8884_88c6957f9181.slice/crio-98b751782ae47b4a3dececa8dc428f1d9988b7faf218b985434b4ef96e989e65 WatchSource:0}: Error finding container 98b751782ae47b4a3dececa8dc428f1d9988b7faf218b985434b4ef96e989e65: Status 404 returned error can't find the container with id 98b751782ae47b4a3dececa8dc428f1d9988b7faf218b985434b4ef96e989e65 Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.633961 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerStarted","Data":"98b751782ae47b4a3dececa8dc428f1d9988b7faf218b985434b4ef96e989e65"} Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.637432 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.637645 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf" event={"ID":"e2ff9719-16f5-410e-af26-8dadc807ca7e","Type":"ContainerDied","Data":"ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9"} Dec 04 09:56:54 crc kubenswrapper[4693]: I1204 09:56:54.637694 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec10136b00dac1bf339b6a97bd2acd44144013b912da8598a672c9b353f505a9" Dec 04 09:56:55 crc kubenswrapper[4693]: I1204 09:56:55.645241 4693 generic.go:334] "Generic (PLEG): container finished" podID="740e10f0-dfaf-4992-8884-88c6957f9181" containerID="1f13a148beff82396853407acee964e012971565562e175c9ce6ec5f37e53caa" exitCode=0 Dec 04 09:56:55 crc kubenswrapper[4693]: I1204 09:56:55.645301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerDied","Data":"1f13a148beff82396853407acee964e012971565562e175c9ce6ec5f37e53caa"} Dec 04 09:56:55 crc kubenswrapper[4693]: I1204 09:56:55.646063 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-64bk5" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="registry-server" containerID="cri-o://d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee" gracePeriod=2 Dec 04 09:56:55 crc kubenswrapper[4693]: I1204 09:56:55.994764 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.095411 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities\") pod \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.095452 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content\") pod \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.095501 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv86w\" (UniqueName: \"kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w\") pod \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\" (UID: \"5ce063bd-35f4-4c1b-a57e-3ef900d69d35\") " Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.096473 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities" (OuterVolumeSpecName: "utilities") pod "5ce063bd-35f4-4c1b-a57e-3ef900d69d35" (UID: "5ce063bd-35f4-4c1b-a57e-3ef900d69d35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.110220 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w" (OuterVolumeSpecName: "kube-api-access-cv86w") pod "5ce063bd-35f4-4c1b-a57e-3ef900d69d35" (UID: "5ce063bd-35f4-4c1b-a57e-3ef900d69d35"). InnerVolumeSpecName "kube-api-access-cv86w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.196940 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.196969 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv86w\" (UniqueName: \"kubernetes.io/projected/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-kube-api-access-cv86w\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.202236 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ce063bd-35f4-4c1b-a57e-3ef900d69d35" (UID: "5ce063bd-35f4-4c1b-a57e-3ef900d69d35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.298617 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ce063bd-35f4-4c1b-a57e-3ef900d69d35-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556034 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg"] Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.556622 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="registry-server" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556642 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="registry-server" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.556654 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="extract" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556664 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="extract" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.556686 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="extract-utilities" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556695 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="extract-utilities" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.556757 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="pull" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556767 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="pull" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 
09:56:56.556790 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="util" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556798 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="util" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.556808 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="extract-content" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556817 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="extract-content" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556943 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ff9719-16f5-410e-af26-8dadc807ca7e" containerName="extract" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.556964 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerName="registry-server" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.557418 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.559870 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.560155 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.561321 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9zlvf" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.570704 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg"] Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.652113 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerStarted","Data":"e3c95ca6d8d6e7e9c498339d1eccc76158e2a6a8e08acd736f6d1e8b20f141da"} Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.654235 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" containerID="d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee" exitCode=0 Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.654273 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerDied","Data":"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee"} Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.654283 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64bk5" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.654301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64bk5" event={"ID":"5ce063bd-35f4-4c1b-a57e-3ef900d69d35","Type":"ContainerDied","Data":"edeaddb7aa1170a5571726b2bd644a7bcadfb8e0bf07ea3003b1c6d9e0d212e3"} Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.654320 4693 scope.go:117] "RemoveContainer" containerID="d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.666535 4693 scope.go:117] "RemoveContainer" containerID="0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.682636 4693 scope.go:117] "RemoveContainer" containerID="0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.701501 4693 scope.go:117] "RemoveContainer" containerID="d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.702152 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee\": container with ID starting with d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee not found: ID does not exist" containerID="d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702181 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee"} err="failed to get container status \"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee\": rpc error: code = NotFound desc = could not find container \"d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee\": container with ID starting with d7ea915338d68436107f05009df0e0c7c340d5e7db2e4b7fb69eecb24d44c4ee not found: ID does not exist" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702211 4693 scope.go:117] "RemoveContainer" containerID="0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.702597 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2\": container with ID starting with 0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2 not found: ID does not exist" containerID="0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702634 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2"} err="failed to get container status \"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2\": rpc error: code = NotFound desc = could not find container \"0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2\": container with ID starting with 0e3f76ae424205423d4250b507be2a851ae117d01ae6db6b2b09b9ab47cd31c2 not found: ID does not exist" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702659 4693 scope.go:117] "RemoveContainer" 
containerID="0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e" Dec 04 09:56:56 crc kubenswrapper[4693]: E1204 09:56:56.702848 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e\": container with ID starting with 0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e not found: ID does not exist" containerID="0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702867 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e"} err="failed to get container status \"0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e\": rpc error: code = NotFound desc = could not find container \"0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e\": container with ID starting with 0b11378722aff666c08da23858717872df7613963ba1d4e7eacc5157a8cb615e not found: ID does not exist" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.702894 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.704770 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fqz\" (UniqueName: \"kubernetes.io/projected/92bcc38e-785b-41c0-9bf0-60db671cf71c-kube-api-access-76fqz\") pod \"nmstate-operator-5b5b58f5c8-6q6qg\" (UID: \"92bcc38e-785b-41c0-9bf0-60db671cf71c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.707076 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-64bk5"] Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.806129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76fqz\" (UniqueName: \"kubernetes.io/projected/92bcc38e-785b-41c0-9bf0-60db671cf71c-kube-api-access-76fqz\") pod \"nmstate-operator-5b5b58f5c8-6q6qg\" (UID: \"92bcc38e-785b-41c0-9bf0-60db671cf71c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.822366 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fqz\" (UniqueName: \"kubernetes.io/projected/92bcc38e-785b-41c0-9bf0-60db671cf71c-kube-api-access-76fqz\") pod \"nmstate-operator-5b5b58f5c8-6q6qg\" (UID: \"92bcc38e-785b-41c0-9bf0-60db671cf71c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" Dec 04 09:56:56 crc kubenswrapper[4693]: I1204 09:56:56.902738 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" Dec 04 09:56:57 crc kubenswrapper[4693]: I1204 09:56:57.118223 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg"] Dec 04 09:56:57 crc kubenswrapper[4693]: W1204 09:56:57.127976 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92bcc38e_785b_41c0_9bf0_60db671cf71c.slice/crio-5c47ec3d454500192ab1e5a2d689ba7f79d74444668721bbbce5b7d65a19d5c7 WatchSource:0}: Error finding container 5c47ec3d454500192ab1e5a2d689ba7f79d74444668721bbbce5b7d65a19d5c7: Status 404 returned error can't find the container with id 5c47ec3d454500192ab1e5a2d689ba7f79d74444668721bbbce5b7d65a19d5c7 Dec 04 09:56:57 crc kubenswrapper[4693]: I1204 09:56:57.660501 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" event={"ID":"92bcc38e-785b-41c0-9bf0-60db671cf71c","Type":"ContainerStarted","Data":"5c47ec3d454500192ab1e5a2d689ba7f79d74444668721bbbce5b7d65a19d5c7"} Dec 04 09:56:57 crc kubenswrapper[4693]: I1204 09:56:57.663835 4693 generic.go:334] "Generic (PLEG): container finished" podID="740e10f0-dfaf-4992-8884-88c6957f9181" containerID="e3c95ca6d8d6e7e9c498339d1eccc76158e2a6a8e08acd736f6d1e8b20f141da" exitCode=0 Dec 04 09:56:57 crc kubenswrapper[4693]: I1204 09:56:57.663914 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerDied","Data":"e3c95ca6d8d6e7e9c498339d1eccc76158e2a6a8e08acd736f6d1e8b20f141da"} Dec 04 09:56:58 crc kubenswrapper[4693]: I1204 09:56:58.468629 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce063bd-35f4-4c1b-a57e-3ef900d69d35" path="/var/lib/kubelet/pods/5ce063bd-35f4-4c1b-a57e-3ef900d69d35/volumes" Dec 04 09:56:58 crc kubenswrapper[4693]: I1204 09:56:58.681275 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerStarted","Data":"3c38a7deef8e2a9320eb13b461c5f35764e99c7ef48412306e287ace001eee5e"} Dec 04 09:56:58 crc kubenswrapper[4693]: I1204 09:56:58.703604 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m6xfz" podStartSLOduration=3.28723182 podStartE2EDuration="5.703588681s" podCreationTimestamp="2025-12-04 09:56:53 +0000 UTC" firstStartedPulling="2025-12-04 09:56:55.647852378 +0000 UTC m=+861.545446141" lastFinishedPulling="2025-12-04 09:56:58.064209249 +0000 UTC m=+863.961803002" observedRunningTime="2025-12-04 09:56:58.699030865 +0000 UTC m=+864.596624618" watchObservedRunningTime="2025-12-04 09:56:58.703588681 +0000 UTC m=+864.601182434" Dec 04 09:56:59 crc kubenswrapper[4693]: I1204 09:56:59.686997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" event={"ID":"92bcc38e-785b-41c0-9bf0-60db671cf71c","Type":"ContainerStarted","Data":"37cffdb0bc0e00896ea2d09a51dfe451c679f8ea25b36176a1aa9c5e9e1a55c5"} Dec 04 09:56:59 crc kubenswrapper[4693]: I1204 09:56:59.706692 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6q6qg" podStartSLOduration=1.314832854 podStartE2EDuration="3.706673666s" podCreationTimestamp="2025-12-04 09:56:56 +0000 UTC" 
firstStartedPulling="2025-12-04 09:56:57.130092012 +0000 UTC m=+863.027685765" lastFinishedPulling="2025-12-04 09:56:59.521932824 +0000 UTC m=+865.419526577" observedRunningTime="2025-12-04 09:56:59.701304759 +0000 UTC m=+865.598898512" watchObservedRunningTime="2025-12-04 09:56:59.706673666 +0000 UTC m=+865.604267419" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.586091 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.589657 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.603299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.687284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnznh\" (UniqueName: \"kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.687375 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.687402 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.788347 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnznh\" (UniqueName: \"kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.788416 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.788449 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.788954 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content\") pod 
\"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.789050 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.814536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnznh\" (UniqueName: \"kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh\") pod \"redhat-marketplace-xt5z6\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:02 crc kubenswrapper[4693]: I1204 09:57:02.923288 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:03 crc kubenswrapper[4693]: I1204 09:57:03.333620 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:03 crc kubenswrapper[4693]: I1204 09:57:03.708840 4693 generic.go:334] "Generic (PLEG): container finished" podID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerID="14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770" exitCode=0 Dec 04 09:57:03 crc kubenswrapper[4693]: I1204 09:57:03.708937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerDied","Data":"14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770"} Dec 04 09:57:03 crc kubenswrapper[4693]: I1204 09:57:03.709490 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerStarted","Data":"ea0f8267997777f199f97810af6844e10e28df5198bb34d7ff1ad5fa906d13b8"} Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.141523 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.141664 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.200470 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.720169 4693 generic.go:334] "Generic (PLEG): container finished" podID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerID="73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee" exitCode=0 Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.720417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerDied","Data":"73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee"} Dec 04 09:57:04 crc kubenswrapper[4693]: I1204 09:57:04.786575 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:05 crc kubenswrapper[4693]: I1204 
09:57:05.727012 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerStarted","Data":"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c"} Dec 04 09:57:05 crc kubenswrapper[4693]: I1204 09:57:05.755924 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xt5z6" podStartSLOduration=2.303972117 podStartE2EDuration="3.755896272s" podCreationTimestamp="2025-12-04 09:57:02 +0000 UTC" firstStartedPulling="2025-12-04 09:57:03.710380852 +0000 UTC m=+869.607974615" lastFinishedPulling="2025-12-04 09:57:05.162305017 +0000 UTC m=+871.059898770" observedRunningTime="2025-12-04 09:57:05.753256149 +0000 UTC m=+871.650849942" watchObservedRunningTime="2025-12-04 09:57:05.755896272 +0000 UTC m=+871.653490065" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.413070 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.414428 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.416216 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8z9hw" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.425199 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.426034 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.431516 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.450809 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.454016 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-897g8"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.454696 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.481225 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.538063 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3bf9b17-2a64-4697-b916-16b8c14f4bff-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.538113 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmgg\" (UniqueName: \"kubernetes.io/projected/a3bf9b17-2a64-4697-b916-16b8c14f4bff-kube-api-access-kdmgg\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.538136 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmt89\" (UniqueName: \"kubernetes.io/projected/5054f7b0-a486-497e-9514-4a1387e7f815-kube-api-access-cmt89\") pod \"nmstate-metrics-7f946cbc9-qvq5w\" (UID: \"5054f7b0-a486-497e-9514-4a1387e7f815\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.546427 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.547078 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.549212 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.550363 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.550981 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mtnrc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.567744 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.640344 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmt89\" (UniqueName: \"kubernetes.io/projected/5054f7b0-a486-497e-9514-4a1387e7f815-kube-api-access-cmt89\") pod \"nmstate-metrics-7f946cbc9-qvq5w\" (UID: \"5054f7b0-a486-497e-9514-4a1387e7f815\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.640453 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-nmstate-lock\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.640598 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-dbus-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.642368 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3bf9b17-2a64-4697-b916-16b8c14f4bff-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.643260 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-ovs-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.643398 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmgg\" (UniqueName: \"kubernetes.io/projected/a3bf9b17-2a64-4697-b916-16b8c14f4bff-kube-api-access-kdmgg\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.643432 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzdc\" (UniqueName: \"kubernetes.io/projected/3ce4524c-acc8-42ff-b575-712649f91f33-kube-api-access-fqzdc\") pod \"nmstate-handler-897g8\" (UID: 
\"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.649756 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a3bf9b17-2a64-4697-b916-16b8c14f4bff-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.666123 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmgg\" (UniqueName: \"kubernetes.io/projected/a3bf9b17-2a64-4697-b916-16b8c14f4bff-kube-api-access-kdmgg\") pod \"nmstate-webhook-5f6d4c5ccb-w2kzc\" (UID: \"a3bf9b17-2a64-4697-b916-16b8c14f4bff\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.683300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmt89\" (UniqueName: \"kubernetes.io/projected/5054f7b0-a486-497e-9514-4a1387e7f815-kube-api-access-cmt89\") pod \"nmstate-metrics-7f946cbc9-qvq5w\" (UID: \"5054f7b0-a486-497e-9514-4a1387e7f815\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.734660 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.744830 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-dbus-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.744894 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04395eb3-d17a-45e1-8c76-5cef70217095-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.744939 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745230 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-dbus-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.744945 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/04395eb3-d17a-45e1-8c76-5cef70217095-kube-api-access-l8jfx\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745477 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-ovs-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745521 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-ovs-socket\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745546 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzdc\" (UniqueName: \"kubernetes.io/projected/3ce4524c-acc8-42ff-b575-712649f91f33-kube-api-access-fqzdc\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745596 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-nmstate-lock\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745656 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.745757 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce4524c-acc8-42ff-b575-712649f91f33-nmstate-lock\") pod \"nmstate-handler-897g8\" (UID: \"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.772999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzdc\" (UniqueName: \"kubernetes.io/projected/3ce4524c-acc8-42ff-b575-712649f91f33-kube-api-access-fqzdc\") pod \"nmstate-handler-897g8\" (UID: 
\"3ce4524c-acc8-42ff-b575-712649f91f33\") " pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.778714 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.780801 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c6ff486f7-9ll2b"] Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.781967 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.799374 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6ff486f7-9ll2b"] Dec 04 09:57:06 crc kubenswrapper[4693]: W1204 09:57:06.812753 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ce4524c_acc8_42ff_b575_712649f91f33.slice/crio-a7ffdd3cefa0edd2edfe5f095e98bec884d7b1afcdee8231743188743d3ef54b WatchSource:0}: Error finding container a7ffdd3cefa0edd2edfe5f095e98bec884d7b1afcdee8231743188743d3ef54b: Status 404 returned error can't find the container with id a7ffdd3cefa0edd2edfe5f095e98bec884d7b1afcdee8231743188743d3ef54b Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.846811 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.846875 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04395eb3-d17a-45e1-8c76-5cef70217095-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.846917 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/04395eb3-d17a-45e1-8c76-5cef70217095-kube-api-access-l8jfx\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: E1204 09:57:06.847106 4693 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 04 09:57:06 crc kubenswrapper[4693]: E1204 09:57:06.847437 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert podName:04395eb3-d17a-45e1-8c76-5cef70217095 nodeName:}" failed. No retries permitted until 2025-12-04 09:57:07.347410104 +0000 UTC m=+873.245003867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-6tjzs" (UID: "04395eb3-d17a-45e1-8c76-5cef70217095") : secret "plugin-serving-cert" not found Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.848276 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04395eb3-d17a-45e1-8c76-5cef70217095-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.876536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8jfx\" (UniqueName: \"kubernetes.io/projected/04395eb3-d17a-45e1-8c76-5cef70217095-kube-api-access-l8jfx\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949377 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949445 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-service-ca\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949477 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-oauth-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949496 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-console-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949511 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ghs\" (UniqueName: \"kubernetes.io/projected/228229d5-c22f-4b39-88cd-bcd4dba37168-kube-api-access-l7ghs\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949540 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-trusted-ca-bundle\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " 
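When a referenced secret does not exist yet (here plugin-serving-cert for the nmstate console plugin), the mount fails and the kubelet schedules another attempt instead of giving up: the nestedpendingoperations entry above gates the retry with durationBeforeRetry 500ms, and a later "MountVolume.SetUp succeeded" entry shows it going through once the secret is available. A generic retry-with-capped-backoff loop of the same shape, using a hypothetical fetchSecret stand-in (the kubelet's actual backoff lives in its pending-operations tracker):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "plugin-serving-cert" not found`)

// fetchSecret is a hypothetical stand-in for resolving the secret a volume
// needs; here it starts succeeding after ~1s to mimic the log above.
func fetchSecret(start time.Time) error {
	if time.Since(start) < time.Second {
		return errNotFound
	}
	return nil
}

func main() {
	start := time.Now()
	delay := 500 * time.Millisecond // durationBeforeRetry in the first failure above
	const maxDelay = 2 * time.Minute

	for attempt := 1; ; attempt++ {
		err := fetchSecret(start)
		if err == nil {
			fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
			return
		}
		fmt.Printf("attempt %d: %v; retrying in %v\n", attempt, err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

The capped doubling keeps a persistently missing secret from hammering the API while still converging quickly once the secret shows up, which is the behavior visible in the surrounding entries.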
pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:06 crc kubenswrapper[4693]: I1204 09:57:06.949566 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-oauth-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051270 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-service-ca\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051352 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-oauth-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051374 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-console-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ghs\" (UniqueName: \"kubernetes.io/projected/228229d5-c22f-4b39-88cd-bcd4dba37168-kube-api-access-l7ghs\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051416 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-trusted-ca-bundle\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-oauth-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.051471 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.052452 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-service-ca\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " 
pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.052649 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-oauth-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.052790 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-console-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.053025 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228229d5-c22f-4b39-88cd-bcd4dba37168-trusted-ca-bundle\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.057373 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-oauth-config\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.060106 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/228229d5-c22f-4b39-88cd-bcd4dba37168-console-serving-cert\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.067251 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ghs\" (UniqueName: \"kubernetes.io/projected/228229d5-c22f-4b39-88cd-bcd4dba37168-kube-api-access-l7ghs\") pod \"console-c6ff486f7-9ll2b\" (UID: \"228229d5-c22f-4b39-88cd-bcd4dba37168\") " pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.121160 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.260215 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w"] Dec 04 09:57:07 crc kubenswrapper[4693]: W1204 09:57:07.269542 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5054f7b0_a486_497e_9514_4a1387e7f815.slice/crio-4ff89b0388119495d2a183379399c341b0dfb6e93428331d836ef01f894a4c46 WatchSource:0}: Error finding container 4ff89b0388119495d2a183379399c341b0dfb6e93428331d836ef01f894a4c46: Status 404 returned error can't find the container with id 4ff89b0388119495d2a183379399c341b0dfb6e93428331d836ef01f894a4c46 Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.300623 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc"] Dec 04 09:57:07 crc kubenswrapper[4693]: W1204 09:57:07.305035 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3bf9b17_2a64_4697_b916_16b8c14f4bff.slice/crio-05aae4b84bfe24cfb3d4fd9612daffcfe746fa2e76b78c72f6e14574b3ba6941 WatchSource:0}: Error finding container 05aae4b84bfe24cfb3d4fd9612daffcfe746fa2e76b78c72f6e14574b3ba6941: Status 404 returned error can't find the container with id 05aae4b84bfe24cfb3d4fd9612daffcfe746fa2e76b78c72f6e14574b3ba6941 Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.315393 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6ff486f7-9ll2b"] Dec 04 09:57:07 crc kubenswrapper[4693]: W1204 09:57:07.320125 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228229d5_c22f_4b39_88cd_bcd4dba37168.slice/crio-5130fbbed1691096f424e4a0cecdd3d6af7c6634cfd7d264e036441730210205 WatchSource:0}: Error finding container 5130fbbed1691096f424e4a0cecdd3d6af7c6634cfd7d264e036441730210205: Status 404 returned error can't find the container with id 5130fbbed1691096f424e4a0cecdd3d6af7c6634cfd7d264e036441730210205 Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.358486 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.362960 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/04395eb3-d17a-45e1-8c76-5cef70217095-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6tjzs\" (UID: \"04395eb3-d17a-45e1-8c76-5cef70217095\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.464111 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.669759 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs"] Dec 04 09:57:07 crc kubenswrapper[4693]: W1204 09:57:07.679170 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04395eb3_d17a_45e1_8c76_5cef70217095.slice/crio-e87a16f0efe09d316d6b78eeed2f91e3477ebb001c5586c3019d9635de0847a5 WatchSource:0}: Error finding container e87a16f0efe09d316d6b78eeed2f91e3477ebb001c5586c3019d9635de0847a5: Status 404 returned error can't find the container with id e87a16f0efe09d316d6b78eeed2f91e3477ebb001c5586c3019d9635de0847a5 Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.740270 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" event={"ID":"5054f7b0-a486-497e-9514-4a1387e7f815","Type":"ContainerStarted","Data":"4ff89b0388119495d2a183379399c341b0dfb6e93428331d836ef01f894a4c46"} Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.741595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-897g8" event={"ID":"3ce4524c-acc8-42ff-b575-712649f91f33","Type":"ContainerStarted","Data":"a7ffdd3cefa0edd2edfe5f095e98bec884d7b1afcdee8231743188743d3ef54b"} Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.743121 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6ff486f7-9ll2b" event={"ID":"228229d5-c22f-4b39-88cd-bcd4dba37168","Type":"ContainerStarted","Data":"5130fbbed1691096f424e4a0cecdd3d6af7c6634cfd7d264e036441730210205"} Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.744183 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" event={"ID":"a3bf9b17-2a64-4697-b916-16b8c14f4bff","Type":"ContainerStarted","Data":"05aae4b84bfe24cfb3d4fd9612daffcfe746fa2e76b78c72f6e14574b3ba6941"} Dec 04 09:57:07 crc kubenswrapper[4693]: I1204 09:57:07.745219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" event={"ID":"04395eb3-d17a-45e1-8c76-5cef70217095","Type":"ContainerStarted","Data":"e87a16f0efe09d316d6b78eeed2f91e3477ebb001c5586c3019d9635de0847a5"} Dec 04 09:57:08 crc kubenswrapper[4693]: I1204 09:57:08.754199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6ff486f7-9ll2b" event={"ID":"228229d5-c22f-4b39-88cd-bcd4dba37168","Type":"ContainerStarted","Data":"247551edd87b0a056cfbb461985c277914cf40f838cb1b2aabd7a24723fc870b"} Dec 04 09:57:08 crc kubenswrapper[4693]: I1204 09:57:08.779182 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6ff486f7-9ll2b" podStartSLOduration=2.779157717 podStartE2EDuration="2.779157717s" podCreationTimestamp="2025-12-04 09:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:57:08.768637137 +0000 UTC m=+874.666230890" watchObservedRunningTime="2025-12-04 09:57:08.779157717 +0000 UTC m=+874.676751510" Dec 04 09:57:09 crc kubenswrapper[4693]: I1204 09:57:09.169891 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:57:09 crc kubenswrapper[4693]: I1204 
09:57:09.170172 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m6xfz" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="registry-server" containerID="cri-o://3c38a7deef8e2a9320eb13b461c5f35764e99c7ef48412306e287ace001eee5e" gracePeriod=2 Dec 04 09:57:09 crc kubenswrapper[4693]: I1204 09:57:09.765404 4693 generic.go:334] "Generic (PLEG): container finished" podID="740e10f0-dfaf-4992-8884-88c6957f9181" containerID="3c38a7deef8e2a9320eb13b461c5f35764e99c7ef48412306e287ace001eee5e" exitCode=0 Dec 04 09:57:09 crc kubenswrapper[4693]: I1204 09:57:09.765476 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerDied","Data":"3c38a7deef8e2a9320eb13b461c5f35764e99c7ef48412306e287ace001eee5e"} Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.636895 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.776722 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6xfz" event={"ID":"740e10f0-dfaf-4992-8884-88c6957f9181","Type":"ContainerDied","Data":"98b751782ae47b4a3dececa8dc428f1d9988b7faf218b985434b4ef96e989e65"} Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.776790 4693 scope.go:117] "RemoveContainer" containerID="3c38a7deef8e2a9320eb13b461c5f35764e99c7ef48412306e287ace001eee5e" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.776794 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6xfz" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.819006 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5ww7\" (UniqueName: \"kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7\") pod \"740e10f0-dfaf-4992-8884-88c6957f9181\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.819170 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content\") pod \"740e10f0-dfaf-4992-8884-88c6957f9181\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.819286 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities\") pod \"740e10f0-dfaf-4992-8884-88c6957f9181\" (UID: \"740e10f0-dfaf-4992-8884-88c6957f9181\") " Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.820484 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities" (OuterVolumeSpecName: "utilities") pod "740e10f0-dfaf-4992-8884-88c6957f9181" (UID: "740e10f0-dfaf-4992-8884-88c6957f9181"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.824709 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7" (OuterVolumeSpecName: "kube-api-access-n5ww7") pod "740e10f0-dfaf-4992-8884-88c6957f9181" (UID: "740e10f0-dfaf-4992-8884-88c6957f9181"). InnerVolumeSpecName "kube-api-access-n5ww7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.858626 4693 scope.go:117] "RemoveContainer" containerID="e3c95ca6d8d6e7e9c498339d1eccc76158e2a6a8e08acd736f6d1e8b20f141da" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.880718 4693 scope.go:117] "RemoveContainer" containerID="1f13a148beff82396853407acee964e012971565562e175c9ce6ec5f37e53caa" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.897038 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "740e10f0-dfaf-4992-8884-88c6957f9181" (UID: "740e10f0-dfaf-4992-8884-88c6957f9181"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.920798 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.920838 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5ww7\" (UniqueName: \"kubernetes.io/projected/740e10f0-dfaf-4992-8884-88c6957f9181-kube-api-access-n5ww7\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:10 crc kubenswrapper[4693]: I1204 09:57:10.920853 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740e10f0-dfaf-4992-8884-88c6957f9181-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.117155 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.123268 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m6xfz"] Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.784482 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" event={"ID":"5054f7b0-a486-497e-9514-4a1387e7f815","Type":"ContainerStarted","Data":"32063766048e78c5a40f844a484419ec737aef62ada511c2ad2b88eabe822ac3"} Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.787462 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-897g8" event={"ID":"3ce4524c-acc8-42ff-b575-712649f91f33","Type":"ContainerStarted","Data":"a2722a69b7a51ed23b3f877a77122f8e61a588b541b77a20a53c04c4c731bf69"} Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.787551 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.789765 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" 
event={"ID":"a3bf9b17-2a64-4697-b916-16b8c14f4bff","Type":"ContainerStarted","Data":"f690764244839e0347e08f94ce86c5fd765aaa3319dcc7b54506dc87ca7dc970"} Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.789919 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.793738 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" event={"ID":"04395eb3-d17a-45e1-8c76-5cef70217095","Type":"ContainerStarted","Data":"8e12cd0e462196bdcbbc81ef39c650e9ef2ba77ef330320c00255d428f94f094"} Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.809405 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-897g8" podStartSLOduration=1.762051213 podStartE2EDuration="5.809382655s" podCreationTimestamp="2025-12-04 09:57:06 +0000 UTC" firstStartedPulling="2025-12-04 09:57:06.818133324 +0000 UTC m=+872.715727077" lastFinishedPulling="2025-12-04 09:57:10.865464716 +0000 UTC m=+876.763058519" observedRunningTime="2025-12-04 09:57:11.808255053 +0000 UTC m=+877.705848806" watchObservedRunningTime="2025-12-04 09:57:11.809382655 +0000 UTC m=+877.706976438" Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.821778 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6tjzs" podStartSLOduration=2.641263361 podStartE2EDuration="5.821758707s" podCreationTimestamp="2025-12-04 09:57:06 +0000 UTC" firstStartedPulling="2025-12-04 09:57:07.682645156 +0000 UTC m=+873.580238929" lastFinishedPulling="2025-12-04 09:57:10.863140482 +0000 UTC m=+876.760734275" observedRunningTime="2025-12-04 09:57:11.820956626 +0000 UTC m=+877.718550379" watchObservedRunningTime="2025-12-04 09:57:11.821758707 +0000 UTC m=+877.719352470" Dec 04 09:57:11 crc kubenswrapper[4693]: I1204 09:57:11.861644 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" podStartSLOduration=2.310151951 podStartE2EDuration="5.861615811s" podCreationTimestamp="2025-12-04 09:57:06 +0000 UTC" firstStartedPulling="2025-12-04 09:57:07.307100785 +0000 UTC m=+873.204694538" lastFinishedPulling="2025-12-04 09:57:10.858564595 +0000 UTC m=+876.756158398" observedRunningTime="2025-12-04 09:57:11.845179476 +0000 UTC m=+877.742773259" watchObservedRunningTime="2025-12-04 09:57:11.861615811 +0000 UTC m=+877.759209574" Dec 04 09:57:12 crc kubenswrapper[4693]: I1204 09:57:12.474067 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" path="/var/lib/kubelet/pods/740e10f0-dfaf-4992-8884-88c6957f9181/volumes" Dec 04 09:57:12 crc kubenswrapper[4693]: I1204 09:57:12.924511 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:12 crc kubenswrapper[4693]: I1204 09:57:12.924556 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:12 crc kubenswrapper[4693]: I1204 09:57:12.959917 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:13 crc kubenswrapper[4693]: I1204 09:57:13.808027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" event={"ID":"5054f7b0-a486-497e-9514-4a1387e7f815","Type":"ContainerStarted","Data":"89d6d5f0330a36109825eae234ef4dab09ae80417637ec77a94d4d93d924d768"} Dec 04 09:57:13 crc kubenswrapper[4693]: I1204 09:57:13.835916 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qvq5w" podStartSLOduration=1.876616334 podStartE2EDuration="7.835895s" podCreationTimestamp="2025-12-04 09:57:06 +0000 UTC" firstStartedPulling="2025-12-04 09:57:07.274309877 +0000 UTC m=+873.171903630" lastFinishedPulling="2025-12-04 09:57:13.233588543 +0000 UTC m=+879.131182296" observedRunningTime="2025-12-04 09:57:13.829224935 +0000 UTC m=+879.726818688" watchObservedRunningTime="2025-12-04 09:57:13.835895 +0000 UTC m=+879.733488753" Dec 04 09:57:13 crc kubenswrapper[4693]: I1204 09:57:13.866240 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:14 crc kubenswrapper[4693]: I1204 09:57:14.972928 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:15 crc kubenswrapper[4693]: I1204 09:57:15.820033 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xt5z6" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="registry-server" containerID="cri-o://6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c" gracePeriod=2 Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.158574 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.297990 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities\") pod \"5beedddc-0b81-470c-8ae2-b927f1fa899d\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.298046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content\") pod \"5beedddc-0b81-470c-8ae2-b927f1fa899d\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.298149 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnznh\" (UniqueName: \"kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh\") pod \"5beedddc-0b81-470c-8ae2-b927f1fa899d\" (UID: \"5beedddc-0b81-470c-8ae2-b927f1fa899d\") " Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.299141 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities" (OuterVolumeSpecName: "utilities") pod "5beedddc-0b81-470c-8ae2-b927f1fa899d" (UID: "5beedddc-0b81-470c-8ae2-b927f1fa899d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.303465 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh" (OuterVolumeSpecName: "kube-api-access-nnznh") pod "5beedddc-0b81-470c-8ae2-b927f1fa899d" (UID: "5beedddc-0b81-470c-8ae2-b927f1fa899d"). InnerVolumeSpecName "kube-api-access-nnznh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.316796 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5beedddc-0b81-470c-8ae2-b927f1fa899d" (UID: "5beedddc-0b81-470c-8ae2-b927f1fa899d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.399915 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnznh\" (UniqueName: \"kubernetes.io/projected/5beedddc-0b81-470c-8ae2-b927f1fa899d-kube-api-access-nnznh\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.400000 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.400015 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5beedddc-0b81-470c-8ae2-b927f1fa899d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.802504 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-897g8" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.828644 4693 generic.go:334] "Generic (PLEG): container finished" podID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerID="6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c" exitCode=0 Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.828692 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerDied","Data":"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c"} Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.828718 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt5z6" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.828740 4693 scope.go:117] "RemoveContainer" containerID="6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.828723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt5z6" event={"ID":"5beedddc-0b81-470c-8ae2-b927f1fa899d","Type":"ContainerDied","Data":"ea0f8267997777f199f97810af6844e10e28df5198bb34d7ff1ad5fa906d13b8"} Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.845391 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.846927 4693 scope.go:117] "RemoveContainer" containerID="73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.849683 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt5z6"] Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.862138 4693 scope.go:117] "RemoveContainer" containerID="14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.877826 4693 scope.go:117] "RemoveContainer" containerID="6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c" Dec 04 09:57:16 crc kubenswrapper[4693]: E1204 09:57:16.878345 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c\": container with ID starting with 6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c not found: ID does not exist" containerID="6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.878381 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c"} err="failed to get container status \"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c\": rpc error: code = NotFound desc = could not find container \"6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c\": container with ID starting with 6bcee1c4d7d05c803f92393d5850ebb47957521dcb23de8c05593f7747e0e51c not found: ID does not exist" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.878421 4693 scope.go:117] "RemoveContainer" containerID="73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee" Dec 04 09:57:16 crc kubenswrapper[4693]: E1204 09:57:16.878723 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee\": container with ID starting with 73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee not found: ID does not exist" containerID="73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.878777 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee"} err="failed to get container status \"73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee\": rpc error: code = NotFound desc = could not find 
container \"73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee\": container with ID starting with 73425ac4bc86bdd090071f6922b25a26ab3908c42578def41069033dff286eee not found: ID does not exist" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.878804 4693 scope.go:117] "RemoveContainer" containerID="14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770" Dec 04 09:57:16 crc kubenswrapper[4693]: E1204 09:57:16.879065 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770\": container with ID starting with 14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770 not found: ID does not exist" containerID="14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770" Dec 04 09:57:16 crc kubenswrapper[4693]: I1204 09:57:16.879091 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770"} err="failed to get container status \"14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770\": rpc error: code = NotFound desc = could not find container \"14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770\": container with ID starting with 14341d98bbbce5d0160ab04e17996844a353d7b45dd6bdfdd8796c647c688770 not found: ID does not exist" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.121555 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.121965 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.128387 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.840398 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c6ff486f7-9ll2b" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.892648 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980233 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980465 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980476 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980514 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="extract-utilities" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980520 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="extract-utilities" Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980528 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="extract-content" Dec 04 09:57:17 crc 
kubenswrapper[4693]: I1204 09:57:17.980533 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="extract-content" Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980542 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980547 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980555 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="extract-content" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980577 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="extract-content" Dec 04 09:57:17 crc kubenswrapper[4693]: E1204 09:57:17.980590 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="extract-utilities" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980596 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="extract-utilities" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980709 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="740e10f0-dfaf-4992-8884-88c6957f9181" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.980719 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" containerName="registry-server" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.981502 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:17 crc kubenswrapper[4693]: I1204 09:57:17.993055 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.126423 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.126496 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plff\" (UniqueName: \"kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.126549 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.227435 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plff\" (UniqueName: \"kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.227482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.227539 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.227936 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.228022 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.248599 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7plff\" (UniqueName: \"kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff\") pod \"community-operators-rdfj7\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.294455 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.474096 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5beedddc-0b81-470c-8ae2-b927f1fa899d" path="/var/lib/kubelet/pods/5beedddc-0b81-470c-8ae2-b927f1fa899d/volumes" Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.777971 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:18 crc kubenswrapper[4693]: I1204 09:57:18.841893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerStarted","Data":"8838e4b6b5d7a1fe2cca437545d2fe4496d7c3b587c7f1787052c57f7c228e10"} Dec 04 09:57:19 crc kubenswrapper[4693]: I1204 09:57:19.849641 4693 generic.go:334] "Generic (PLEG): container finished" podID="596b1459-2113-46a0-9862-3c670679047f" containerID="7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75" exitCode=0 Dec 04 09:57:19 crc kubenswrapper[4693]: I1204 09:57:19.849696 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerDied","Data":"7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75"} Dec 04 09:57:20 crc kubenswrapper[4693]: I1204 09:57:20.858479 4693 generic.go:334] "Generic (PLEG): container finished" podID="596b1459-2113-46a0-9862-3c670679047f" containerID="dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8" exitCode=0 Dec 04 09:57:20 crc kubenswrapper[4693]: I1204 09:57:20.858589 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerDied","Data":"dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8"} Dec 04 09:57:21 crc kubenswrapper[4693]: I1204 09:57:21.868167 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerStarted","Data":"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf"} Dec 04 09:57:21 crc kubenswrapper[4693]: I1204 09:57:21.888841 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rdfj7" podStartSLOduration=3.462484409 podStartE2EDuration="4.888818247s" podCreationTimestamp="2025-12-04 09:57:17 +0000 UTC" firstStartedPulling="2025-12-04 09:57:19.852067699 +0000 UTC m=+885.749661452" lastFinishedPulling="2025-12-04 09:57:21.278401537 +0000 UTC m=+887.175995290" observedRunningTime="2025-12-04 09:57:21.887776108 +0000 UTC m=+887.785369861" watchObservedRunningTime="2025-12-04 09:57:21.888818247 +0000 UTC m=+887.786412020" Dec 04 09:57:26 crc kubenswrapper[4693]: I1204 09:57:26.751520 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w2kzc" Dec 04 
09:57:28 crc kubenswrapper[4693]: I1204 09:57:28.295248 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:28 crc kubenswrapper[4693]: I1204 09:57:28.295813 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:28 crc kubenswrapper[4693]: I1204 09:57:28.341917 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:28 crc kubenswrapper[4693]: I1204 09:57:28.950707 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:28 crc kubenswrapper[4693]: I1204 09:57:28.984953 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:30 crc kubenswrapper[4693]: I1204 09:57:30.920371 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rdfj7" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="registry-server" containerID="cri-o://a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf" gracePeriod=2 Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.352139 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.493954 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content\") pod \"596b1459-2113-46a0-9862-3c670679047f\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.494019 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities\") pod \"596b1459-2113-46a0-9862-3c670679047f\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.494097 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plff\" (UniqueName: \"kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff\") pod \"596b1459-2113-46a0-9862-3c670679047f\" (UID: \"596b1459-2113-46a0-9862-3c670679047f\") " Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.495998 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities" (OuterVolumeSpecName: "utilities") pod "596b1459-2113-46a0-9862-3c670679047f" (UID: "596b1459-2113-46a0-9862-3c670679047f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.500314 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff" (OuterVolumeSpecName: "kube-api-access-7plff") pod "596b1459-2113-46a0-9862-3c670679047f" (UID: "596b1459-2113-46a0-9862-3c670679047f"). InnerVolumeSpecName "kube-api-access-7plff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.542841 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596b1459-2113-46a0-9862-3c670679047f" (UID: "596b1459-2113-46a0-9862-3c670679047f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.596000 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.596034 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596b1459-2113-46a0-9862-3c670679047f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.596044 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plff\" (UniqueName: \"kubernetes.io/projected/596b1459-2113-46a0-9862-3c670679047f-kube-api-access-7plff\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.927197 4693 generic.go:334] "Generic (PLEG): container finished" podID="596b1459-2113-46a0-9862-3c670679047f" containerID="a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf" exitCode=0 Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.927258 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdfj7" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.927308 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerDied","Data":"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf"} Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.927740 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdfj7" event={"ID":"596b1459-2113-46a0-9862-3c670679047f","Type":"ContainerDied","Data":"8838e4b6b5d7a1fe2cca437545d2fe4496d7c3b587c7f1787052c57f7c228e10"} Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.927772 4693 scope.go:117] "RemoveContainer" containerID="a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.945122 4693 scope.go:117] "RemoveContainer" containerID="dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8" Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.956532 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.961159 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rdfj7"] Dec 04 09:57:31 crc kubenswrapper[4693]: I1204 09:57:31.988444 4693 scope.go:117] "RemoveContainer" containerID="7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.005973 4693 scope.go:117] "RemoveContainer" containerID="a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf" Dec 04 09:57:32 crc kubenswrapper[4693]: E1204 09:57:32.006307 4693 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf\": container with ID starting with a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf not found: ID does not exist" containerID="a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.006369 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf"} err="failed to get container status \"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf\": rpc error: code = NotFound desc = could not find container \"a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf\": container with ID starting with a0a27d4117de2d02fb9ab4516150d084234a13fbce5bb8a8b37982bd3747f3cf not found: ID does not exist" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.006403 4693 scope.go:117] "RemoveContainer" containerID="dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8" Dec 04 09:57:32 crc kubenswrapper[4693]: E1204 09:57:32.006720 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8\": container with ID starting with dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8 not found: ID does not exist" containerID="dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.006748 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8"} err="failed to get container status \"dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8\": rpc error: code = NotFound desc = could not find container \"dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8\": container with ID starting with dec40be698394a7177a608778f65e7856ad712dea10a17818be2ba43cff391c8 not found: ID does not exist" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.006766 4693 scope.go:117] "RemoveContainer" containerID="7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75" Dec 04 09:57:32 crc kubenswrapper[4693]: E1204 09:57:32.006993 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75\": container with ID starting with 7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75 not found: ID does not exist" containerID="7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.007021 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75"} err="failed to get container status \"7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75\": rpc error: code = NotFound desc = could not find container \"7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75\": container with ID starting with 7866020d13fe57642d79a4a967a0fc2439db0b991339ef5bf568d00ec85a7d75 not found: ID does not exist" Dec 04 09:57:32 crc kubenswrapper[4693]: I1204 09:57:32.473236 4693 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="596b1459-2113-46a0-9862-3c670679047f" path="/var/lib/kubelet/pods/596b1459-2113-46a0-9862-3c670679047f/volumes" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.249749 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw"] Dec 04 09:57:40 crc kubenswrapper[4693]: E1204 09:57:40.250755 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="extract-content" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.250767 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="extract-content" Dec 04 09:57:40 crc kubenswrapper[4693]: E1204 09:57:40.250782 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="registry-server" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.250788 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="registry-server" Dec 04 09:57:40 crc kubenswrapper[4693]: E1204 09:57:40.250799 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="extract-utilities" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.250806 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="extract-utilities" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.250909 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="596b1459-2113-46a0-9862-3c670679047f" containerName="registry-server" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.251684 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.253388 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.270543 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw"] Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.346566 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqtm\" (UniqueName: \"kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.346619 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.346650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.447380 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqtm\" (UniqueName: \"kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.447430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.447447 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.447958 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.447960 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.472179 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqtm\" (UniqueName: \"kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.570585 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.956584 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw"] Dec 04 09:57:40 crc kubenswrapper[4693]: I1204 09:57:40.991970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" event={"ID":"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1","Type":"ContainerStarted","Data":"ec7caca4f252b721d6cc1b2c68592bbf41dd976016dcbaba7adbdc8e242b2331"} Dec 04 09:57:42 crc kubenswrapper[4693]: I1204 09:57:42.004884 4693 generic.go:334] "Generic (PLEG): container finished" podID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerID="a73953d040115a65d17fd2d70aaa605b8f2d3d15cf508a5c57d102c303f92f00" exitCode=0 Dec 04 09:57:42 crc kubenswrapper[4693]: I1204 09:57:42.005001 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" event={"ID":"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1","Type":"ContainerDied","Data":"a73953d040115a65d17fd2d70aaa605b8f2d3d15cf508a5c57d102c303f92f00"} Dec 04 09:57:42 crc kubenswrapper[4693]: I1204 09:57:42.944922 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-mhzjn" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" containerID="cri-o://f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5" gracePeriod=15 Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.634148 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mhzjn_18edbe10-dd1c-47a9-b8de-5f2d53306f2e/console/0.log" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.634675 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790314 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790411 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790445 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790477 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790537 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790593 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhdnh\" (UniqueName: \"kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.790632 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert\") pod \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\" (UID: \"18edbe10-dd1c-47a9-b8de-5f2d53306f2e\") " Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.791546 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca" (OuterVolumeSpecName: "service-ca") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.791765 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config" (OuterVolumeSpecName: "console-config") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.791798 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.792018 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.792220 4693 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.792235 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.792244 4693 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.792253 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.798692 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.798715 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh" (OuterVolumeSpecName: "kube-api-access-nhdnh") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "kube-api-access-nhdnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.799082 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "18edbe10-dd1c-47a9-b8de-5f2d53306f2e" (UID: "18edbe10-dd1c-47a9-b8de-5f2d53306f2e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.893036 4693 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.893076 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhdnh\" (UniqueName: \"kubernetes.io/projected/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-kube-api-access-nhdnh\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:43 crc kubenswrapper[4693]: I1204 09:57:43.893087 4693 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18edbe10-dd1c-47a9-b8de-5f2d53306f2e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.021961 4693 generic.go:334] "Generic (PLEG): container finished" podID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerID="e5bc1635372aebf7f66adceed043ab93e1ac3d4bc53b7c9d17fe16758f7134b9" exitCode=0 Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.022035 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" event={"ID":"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1","Type":"ContainerDied","Data":"e5bc1635372aebf7f66adceed043ab93e1ac3d4bc53b7c9d17fe16758f7134b9"} Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024204 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mhzjn_18edbe10-dd1c-47a9-b8de-5f2d53306f2e/console/0.log" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024367 4693 generic.go:334] "Generic (PLEG): container finished" podID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerID="f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5" exitCode=2 Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024416 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mhzjn" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mhzjn" event={"ID":"18edbe10-dd1c-47a9-b8de-5f2d53306f2e","Type":"ContainerDied","Data":"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5"} Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024680 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mhzjn" event={"ID":"18edbe10-dd1c-47a9-b8de-5f2d53306f2e","Type":"ContainerDied","Data":"dee2200753a46f9a527a6d5d716f2e045742a754badd582ffa13e62bae80f73e"} Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.024699 4693 scope.go:117] "RemoveContainer" containerID="f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.060758 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.061749 4693 scope.go:117] "RemoveContainer" containerID="f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5" Dec 04 09:57:44 crc kubenswrapper[4693]: E1204 09:57:44.062167 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5\": container with ID starting with f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5 not found: ID does not exist" containerID="f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.062204 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5"} err="failed to get container status \"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5\": rpc error: code = NotFound desc = could not find container \"f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5\": container with ID starting with f6dbc0e3be88db0b7b40fc2d4a4ba40aac880b8dcee1c49fe74d6a8e5f94d9c5 not found: ID does not exist" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.064987 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-mhzjn"] Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.471627 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" path="/var/lib/kubelet/pods/18edbe10-dd1c-47a9-b8de-5f2d53306f2e/volumes" Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.617134 4693 patch_prober.go:28] interesting pod/console-f9d7485db-mhzjn container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:57:44 crc kubenswrapper[4693]: I1204 09:57:44.617613 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-mhzjn" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:57:45 crc kubenswrapper[4693]: I1204 
09:57:45.032100 4693 generic.go:334] "Generic (PLEG): container finished" podID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerID="13c29fea3860ff6549038a6710690ec65f830a227faf530dbbae1e42355bf762" exitCode=0 Dec 04 09:57:45 crc kubenswrapper[4693]: I1204 09:57:45.032165 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" event={"ID":"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1","Type":"ContainerDied","Data":"13c29fea3860ff6549038a6710690ec65f830a227faf530dbbae1e42355bf762"} Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.250690 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.423778 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle\") pod \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.424112 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util\") pod \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.424237 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cqtm\" (UniqueName: \"kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm\") pod \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\" (UID: \"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1\") " Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.425039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle" (OuterVolumeSpecName: "bundle") pod "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" (UID: "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.429689 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm" (OuterVolumeSpecName: "kube-api-access-5cqtm") pod "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" (UID: "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1"). InnerVolumeSpecName "kube-api-access-5cqtm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.525614 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.525846 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqtm\" (UniqueName: \"kubernetes.io/projected/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-kube-api-access-5cqtm\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.890938 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util" (OuterVolumeSpecName: "util") pod "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" (UID: "b6e8fbed-f202-40ea-ad04-5bbbb338f8e1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:57:46 crc kubenswrapper[4693]: I1204 09:57:46.930093 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6e8fbed-f202-40ea-ad04-5bbbb338f8e1-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:57:47 crc kubenswrapper[4693]: I1204 09:57:47.044574 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" event={"ID":"b6e8fbed-f202-40ea-ad04-5bbbb338f8e1","Type":"ContainerDied","Data":"ec7caca4f252b721d6cc1b2c68592bbf41dd976016dcbaba7adbdc8e242b2331"} Dec 04 09:57:47 crc kubenswrapper[4693]: I1204 09:57:47.044621 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7caca4f252b721d6cc1b2c68592bbf41dd976016dcbaba7adbdc8e242b2331" Dec 04 09:57:47 crc kubenswrapper[4693]: I1204 09:57:47.044688 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.758956 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx"] Dec 04 09:57:55 crc kubenswrapper[4693]: E1204 09:57:55.759648 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="util" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759660 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="util" Dec 04 09:57:55 crc kubenswrapper[4693]: E1204 09:57:55.759676 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="extract" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759682 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="extract" Dec 04 09:57:55 crc kubenswrapper[4693]: E1204 09:57:55.759695 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="pull" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759701 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="pull" Dec 04 09:57:55 crc kubenswrapper[4693]: E1204 09:57:55.759709 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759714 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759805 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="18edbe10-dd1c-47a9-b8de-5f2d53306f2e" containerName="console" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.759815 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e8fbed-f202-40ea-ad04-5bbbb338f8e1" containerName="extract" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.760200 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.762215 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dzrbb" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.762314 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.763363 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.763676 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.764502 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.782945 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx"] Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.844063 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjpm2\" (UniqueName: \"kubernetes.io/projected/990d09c4-a9e4-4233-ae12-3910f2937270-kube-api-access-wjpm2\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.844146 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-webhook-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.844185 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-apiservice-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.945437 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-webhook-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.945810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-apiservice-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.945885 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wjpm2\" (UniqueName: \"kubernetes.io/projected/990d09c4-a9e4-4233-ae12-3910f2937270-kube-api-access-wjpm2\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.950960 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-webhook-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.951190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/990d09c4-a9e4-4233-ae12-3910f2937270-apiservice-cert\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:55 crc kubenswrapper[4693]: I1204 09:57:55.968122 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjpm2\" (UniqueName: \"kubernetes.io/projected/990d09c4-a9e4-4233-ae12-3910f2937270-kube-api-access-wjpm2\") pod \"metallb-operator-controller-manager-cfc67b4f5-6t7cx\" (UID: \"990d09c4-a9e4-4233-ae12-3910f2937270\") " pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.075880 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.091513 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f644b44db-65csk"] Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.092354 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.094285 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8ps9p" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.094341 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.094458 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.111170 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f644b44db-65csk"] Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.147940 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9x5m\" (UniqueName: \"kubernetes.io/projected/d7810505-b15b-4970-8cf0-f7217394a1ca-kube-api-access-l9x5m\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.147990 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-webhook-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.148020 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-apiservice-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.254885 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-webhook-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.261555 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-apiservice-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.261789 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9x5m\" (UniqueName: \"kubernetes.io/projected/d7810505-b15b-4970-8cf0-f7217394a1ca-kube-api-access-l9x5m\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 
09:57:56.264388 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-apiservice-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.276413 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7810505-b15b-4970-8cf0-f7217394a1ca-webhook-cert\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.316148 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9x5m\" (UniqueName: \"kubernetes.io/projected/d7810505-b15b-4970-8cf0-f7217394a1ca-kube-api-access-l9x5m\") pod \"metallb-operator-webhook-server-5f644b44db-65csk\" (UID: \"d7810505-b15b-4970-8cf0-f7217394a1ca\") " pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.389734 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx"] Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.459852 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:57:56 crc kubenswrapper[4693]: I1204 09:57:56.676778 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f644b44db-65csk"] Dec 04 09:57:57 crc kubenswrapper[4693]: I1204 09:57:57.102139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" event={"ID":"d7810505-b15b-4970-8cf0-f7217394a1ca","Type":"ContainerStarted","Data":"a5e5eaf789c85d3fc15fe6fba00e2067677deb5a04d4da72deca36be14424d56"} Dec 04 09:57:57 crc kubenswrapper[4693]: I1204 09:57:57.103156 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" event={"ID":"990d09c4-a9e4-4233-ae12-3910f2937270","Type":"ContainerStarted","Data":"da719e8a0fc3160065939fe78a67d571e767de6a78cb21b00412b7724420ce4d"} Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.139129 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" event={"ID":"d7810505-b15b-4970-8cf0-f7217394a1ca","Type":"ContainerStarted","Data":"7db9ca3e2cfcf315840c418e7dcc65573b383dc2a70e12bf8fa6eea9c5a77e5c"} Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.139914 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.140562 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" event={"ID":"990d09c4-a9e4-4233-ae12-3910f2937270","Type":"ContainerStarted","Data":"2fafdc13b9abbdab8ee2d1f8ee1366324fa9c4901e347acbdaa4c0964fba439c"} Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.140730 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.162864 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" podStartSLOduration=1.424523974 podStartE2EDuration="6.162842696s" podCreationTimestamp="2025-12-04 09:57:56 +0000 UTC" firstStartedPulling="2025-12-04 09:57:56.694040352 +0000 UTC m=+922.591634105" lastFinishedPulling="2025-12-04 09:58:01.432359074 +0000 UTC m=+927.329952827" observedRunningTime="2025-12-04 09:58:02.154547787 +0000 UTC m=+928.052141560" watchObservedRunningTime="2025-12-04 09:58:02.162842696 +0000 UTC m=+928.060436449" Dec 04 09:58:02 crc kubenswrapper[4693]: I1204 09:58:02.184251 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" podStartSLOduration=2.174781786 podStartE2EDuration="7.184226889s" podCreationTimestamp="2025-12-04 09:57:55 +0000 UTC" firstStartedPulling="2025-12-04 09:57:56.39651256 +0000 UTC m=+922.294106313" lastFinishedPulling="2025-12-04 09:58:01.405957663 +0000 UTC m=+927.303551416" observedRunningTime="2025-12-04 09:58:02.178215092 +0000 UTC m=+928.075808845" watchObservedRunningTime="2025-12-04 09:58:02.184226889 +0000 UTC m=+928.081820682" Dec 04 09:58:16 crc kubenswrapper[4693]: I1204 09:58:16.467611 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5f644b44db-65csk" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.079237 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cfc67b4f5-6t7cx" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.820713 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rc5g6"] Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.824945 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6"] Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.825617 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.826323 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.832315 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.835384 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.835506 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-sq46x" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.835687 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.854612 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6"] Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898061 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-reloader\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898130 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-sockets\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898157 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-conf\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898197 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898221 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898264 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1455f336-f228-46d6-b944-4c76aa652335-frr-startup\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898289 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7t27\" (UniqueName: \"kubernetes.io/projected/1455f336-f228-46d6-b944-4c76aa652335-kube-api-access-h7t27\") pod \"frr-k8s-rc5g6\" (UID: 
\"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898318 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxrx\" (UniqueName: \"kubernetes.io/projected/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-kube-api-access-kkxrx\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.898371 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-metrics\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.990116 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-f7pnf"] Dec 04 09:58:36 crc kubenswrapper[4693]: I1204 09:58:36.991837 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.000450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-reloader\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.000684 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-sockets\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.000760 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-conf\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.000842 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.000916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.001004 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1455f336-f228-46d6-b944-4c76aa652335-frr-startup\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.001076 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h7t27\" (UniqueName: \"kubernetes.io/projected/1455f336-f228-46d6-b944-4c76aa652335-kube-api-access-h7t27\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.001148 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkxrx\" (UniqueName: \"kubernetes.io/projected/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-kube-api-access-kkxrx\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.001231 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-metrics\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.001858 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-sockets\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.004066 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.004080 4693 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.004160 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs podName:1455f336-f228-46d6-b944-4c76aa652335 nodeName:}" failed. No retries permitted until 2025-12-04 09:58:37.504134223 +0000 UTC m=+963.401727976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs") pod "frr-k8s-rc5g6" (UID: "1455f336-f228-46d6-b944-4c76aa652335") : secret "frr-k8s-certs-secret" not found Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.004291 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pk74h" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.004387 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.004481 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.005988 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1455f336-f228-46d6-b944-4c76aa652335-frr-startup\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.006289 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-metrics\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.008671 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-frr-conf\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.008717 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1455f336-f228-46d6-b944-4c76aa652335-reloader\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.017301 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.023055 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-tvgv5"] Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.024224 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.025151 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7t27\" (UniqueName: \"kubernetes.io/projected/1455f336-f228-46d6-b944-4c76aa652335-kube-api-access-h7t27\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.029055 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.031722 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-tvgv5"] Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.037185 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkxrx\" (UniqueName: \"kubernetes.io/projected/f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a-kube-api-access-kkxrx\") pod \"frr-k8s-webhook-server-7fcb986d4-qdqd6\" (UID: \"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103077 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103175 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-cert\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103237 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103271 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8cb6de38-296b-415b-8f7c-aa037586a5db-metallb-excludel2\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103457 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckcfh\" (UniqueName: \"kubernetes.io/projected/8cb6de38-296b-415b-8f7c-aa037586a5db-kube-api-access-ckcfh\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.103580 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-metrics-certs\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc 
kubenswrapper[4693]: I1204 09:58:37.103611 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdqq\" (UniqueName: \"kubernetes.io/projected/05ad9856-cd9f-4317-8c24-ebfa61baa56b-kube-api-access-6vdqq\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.145580 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.204875 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205246 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8cb6de38-296b-415b-8f7c-aa037586a5db-metallb-excludel2\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205379 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcfh\" (UniqueName: \"kubernetes.io/projected/8cb6de38-296b-415b-8f7c-aa037586a5db-kube-api-access-ckcfh\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.205074 4693 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205657 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-metrics-certs\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.205671 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist podName:8cb6de38-296b-415b-8f7c-aa037586a5db nodeName:}" failed. No retries permitted until 2025-12-04 09:58:37.705640909 +0000 UTC m=+963.603234882 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist") pod "speaker-f7pnf" (UID: "8cb6de38-296b-415b-8f7c-aa037586a5db") : secret "metallb-memberlist" not found Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205723 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdqq\" (UniqueName: \"kubernetes.io/projected/05ad9856-cd9f-4317-8c24-ebfa61baa56b-kube-api-access-6vdqq\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205760 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.205933 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-cert\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.206022 4693 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.206097 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs podName:05ad9856-cd9f-4317-8c24-ebfa61baa56b nodeName:}" failed. No retries permitted until 2025-12-04 09:58:37.706076721 +0000 UTC m=+963.603670474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs") pod "controller-f8648f98b-tvgv5" (UID: "05ad9856-cd9f-4317-8c24-ebfa61baa56b") : secret "controller-certs-secret" not found Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.206109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8cb6de38-296b-415b-8f7c-aa037586a5db-metallb-excludel2\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.209172 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.211310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-metrics-certs\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.220908 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-cert\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.226080 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdqq\" (UniqueName: \"kubernetes.io/projected/05ad9856-cd9f-4317-8c24-ebfa61baa56b-kube-api-access-6vdqq\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.228666 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcfh\" (UniqueName: \"kubernetes.io/projected/8cb6de38-296b-415b-8f7c-aa037586a5db-kube-api-access-ckcfh\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.510636 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.515841 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1455f336-f228-46d6-b944-4c76aa652335-metrics-certs\") pod \"frr-k8s-rc5g6\" (UID: \"1455f336-f228-46d6-b944-4c76aa652335\") " pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.605526 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6"] Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.713284 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:37 crc 
kubenswrapper[4693]: I1204 09:58:37.713396 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.713504 4693 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 09:58:37 crc kubenswrapper[4693]: E1204 09:58:37.713606 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist podName:8cb6de38-296b-415b-8f7c-aa037586a5db nodeName:}" failed. No retries permitted until 2025-12-04 09:58:38.713584894 +0000 UTC m=+964.611178647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist") pod "speaker-f7pnf" (UID: "8cb6de38-296b-415b-8f7c-aa037586a5db") : secret "metallb-memberlist" not found Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.717495 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05ad9856-cd9f-4317-8c24-ebfa61baa56b-metrics-certs\") pod \"controller-f8648f98b-tvgv5\" (UID: \"05ad9856-cd9f-4317-8c24-ebfa61baa56b\") " pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.757754 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:37 crc kubenswrapper[4693]: I1204 09:58:37.984640 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.174796 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-tvgv5"] Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.360983 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"f39e46503d4ffbd3052d1c02a32d9c1dadf1d92c92266666d3d4b758342a5258"} Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.361930 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" event={"ID":"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a","Type":"ContainerStarted","Data":"c5cd320225d5d5ca7b787b21430c0f48d9fbfb1f8ad2205848953c0b91e779d6"} Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.362994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-tvgv5" event={"ID":"05ad9856-cd9f-4317-8c24-ebfa61baa56b","Type":"ContainerStarted","Data":"4eac09362360bb4a679925c9c4982e4faaf3727bfc0fe61e59a62b14f9bfe151"} Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.727110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.733628 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8cb6de38-296b-415b-8f7c-aa037586a5db-memberlist\") pod \"speaker-f7pnf\" (UID: \"8cb6de38-296b-415b-8f7c-aa037586a5db\") " pod="metallb-system/speaker-f7pnf" Dec 04 09:58:38 crc kubenswrapper[4693]: I1204 09:58:38.809098 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-f7pnf" Dec 04 09:58:38 crc kubenswrapper[4693]: W1204 09:58:38.864515 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb6de38_296b_415b_8f7c_aa037586a5db.slice/crio-8f24978a24b09df1daa34f49fa24585a3c59930f21ebf839cecf1865c4beaa7b WatchSource:0}: Error finding container 8f24978a24b09df1daa34f49fa24585a3c59930f21ebf839cecf1865c4beaa7b: Status 404 returned error can't find the container with id 8f24978a24b09df1daa34f49fa24585a3c59930f21ebf839cecf1865c4beaa7b Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.385296 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-f7pnf" event={"ID":"8cb6de38-296b-415b-8f7c-aa037586a5db","Type":"ContainerStarted","Data":"3da28a0185a34a9965aa86a9b827c8b2e9322cbf57665b9d530c80f042ab2666"} Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.385380 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-f7pnf" event={"ID":"8cb6de38-296b-415b-8f7c-aa037586a5db","Type":"ContainerStarted","Data":"8f24978a24b09df1daa34f49fa24585a3c59930f21ebf839cecf1865c4beaa7b"} Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.388659 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-tvgv5" event={"ID":"05ad9856-cd9f-4317-8c24-ebfa61baa56b","Type":"ContainerStarted","Data":"0598dbb0ca7981c12bb1c06932d4c61a3a5e285e6d244bb7a626c09814c3f24f"} Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.388693 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-tvgv5" event={"ID":"05ad9856-cd9f-4317-8c24-ebfa61baa56b","Type":"ContainerStarted","Data":"1b745293ac449ac002ed7d8da9ebc2b06832e6d6945222d0161984cdbf72bd89"} Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.389625 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:39 crc kubenswrapper[4693]: I1204 09:58:39.412324 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-tvgv5" podStartSLOduration=3.412306078 podStartE2EDuration="3.412306078s" podCreationTimestamp="2025-12-04 09:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:58:39.409956644 +0000 UTC m=+965.307550397" watchObservedRunningTime="2025-12-04 09:58:39.412306078 +0000 UTC m=+965.309899831" Dec 04 09:58:40 crc kubenswrapper[4693]: I1204 09:58:40.399176 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-f7pnf" event={"ID":"8cb6de38-296b-415b-8f7c-aa037586a5db","Type":"ContainerStarted","Data":"8127af58cadc5a07aeecb65b36e1f95eaf8956d1c88b4d2b814ce47d19381426"} Dec 04 09:58:40 crc kubenswrapper[4693]: I1204 09:58:40.426823 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-f7pnf" podStartSLOduration=4.42680466 podStartE2EDuration="4.42680466s" podCreationTimestamp="2025-12-04 09:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:58:40.425224186 +0000 UTC m=+966.322817939" watchObservedRunningTime="2025-12-04 09:58:40.42680466 +0000 UTC m=+966.324398413" Dec 04 09:58:41 crc kubenswrapper[4693]: I1204 09:58:41.578531 4693 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/speaker-f7pnf" Dec 04 09:58:46 crc kubenswrapper[4693]: I1204 09:58:46.605324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" event={"ID":"f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a","Type":"ContainerStarted","Data":"2aa7e3ec24d1d43469f07682ba5a08c1bec42e4451205d087b7629ce4a30e7d7"} Dec 04 09:58:46 crc kubenswrapper[4693]: I1204 09:58:46.605973 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:46 crc kubenswrapper[4693]: I1204 09:58:46.607309 4693 generic.go:334] "Generic (PLEG): container finished" podID="1455f336-f228-46d6-b944-4c76aa652335" containerID="a71c21fc6717f8e946b8cfb8d45356861fcb195352e2fc6a397dd37fc83813f9" exitCode=0 Dec 04 09:58:46 crc kubenswrapper[4693]: I1204 09:58:46.607399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerDied","Data":"a71c21fc6717f8e946b8cfb8d45356861fcb195352e2fc6a397dd37fc83813f9"} Dec 04 09:58:46 crc kubenswrapper[4693]: I1204 09:58:46.634638 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" podStartSLOduration=3.5826916730000002 podStartE2EDuration="10.634614813s" podCreationTimestamp="2025-12-04 09:58:36 +0000 UTC" firstStartedPulling="2025-12-04 09:58:37.613853144 +0000 UTC m=+963.511446907" lastFinishedPulling="2025-12-04 09:58:44.665776294 +0000 UTC m=+970.563370047" observedRunningTime="2025-12-04 09:58:46.628603546 +0000 UTC m=+972.526197319" watchObservedRunningTime="2025-12-04 09:58:46.634614813 +0000 UTC m=+972.532208576" Dec 04 09:58:47 crc kubenswrapper[4693]: I1204 09:58:47.616812 4693 generic.go:334] "Generic (PLEG): container finished" podID="1455f336-f228-46d6-b944-4c76aa652335" containerID="8690399895e37e71a4d93d46188271becc2339218e74611350ff89388c11ec66" exitCode=0 Dec 04 09:58:47 crc kubenswrapper[4693]: I1204 09:58:47.616891 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerDied","Data":"8690399895e37e71a4d93d46188271becc2339218e74611350ff89388c11ec66"} Dec 04 09:58:48 crc kubenswrapper[4693]: I1204 09:58:48.624149 4693 generic.go:334] "Generic (PLEG): container finished" podID="1455f336-f228-46d6-b944-4c76aa652335" containerID="17624ff00d204fbef326533cddce0a0f1f0284e51aa8e9143f3c91ff66c439b6" exitCode=0 Dec 04 09:58:48 crc kubenswrapper[4693]: I1204 09:58:48.624195 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerDied","Data":"17624ff00d204fbef326533cddce0a0f1f0284e51aa8e9143f3c91ff66c439b6"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646311 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"2c6b4c0e4bb994af782653fe01b9e1f3e95e3deb7e9bc3baa13076b114b88daa"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646629 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"5ac6df4eb3c22086f152d48fae9eb136b2afddce17b12490e37ccac690347e0e"} Dec 04 09:58:49 crc 
kubenswrapper[4693]: I1204 09:58:49.646642 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"8f223527f7f72dea0a0bbac57010e6ad27ad215cb24182372684ae7bcca79d0e"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"c575feead88b72d21c90f6066938159fcfe90a3d76201150b86a373d9f63f29a"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"20263e9e931baa9d4a674e28875ba5394c250bad4b7a3edfab3716e2d02c3f55"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646673 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rc5g6" event={"ID":"1455f336-f228-46d6-b944-4c76aa652335","Type":"ContainerStarted","Data":"c4fadb6270cbd3400d06a2890c5bee7ba0917c455796989cf3557ee661dabda0"} Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.646952 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:49 crc kubenswrapper[4693]: I1204 09:58:49.673914 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rc5g6" podStartSLOduration=6.894590794 podStartE2EDuration="13.673898561s" podCreationTimestamp="2025-12-04 09:58:36 +0000 UTC" firstStartedPulling="2025-12-04 09:58:37.865038784 +0000 UTC m=+963.762632537" lastFinishedPulling="2025-12-04 09:58:44.644346551 +0000 UTC m=+970.541940304" observedRunningTime="2025-12-04 09:58:49.669194903 +0000 UTC m=+975.566788676" watchObservedRunningTime="2025-12-04 09:58:49.673898561 +0000 UTC m=+975.571492314" Dec 04 09:58:52 crc kubenswrapper[4693]: I1204 09:58:52.759026 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:52 crc kubenswrapper[4693]: I1204 09:58:52.826023 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:58:57 crc kubenswrapper[4693]: I1204 09:58:57.153621 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-qdqd6" Dec 04 09:58:57 crc kubenswrapper[4693]: I1204 09:58:57.990792 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-tvgv5" Dec 04 09:58:58 crc kubenswrapper[4693]: I1204 09:58:58.816256 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-f7pnf" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.768034 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.769265 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.772028 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n5stf" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.772098 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.777403 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.800706 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.855748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxx7w\" (UniqueName: \"kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w\") pod \"openstack-operator-index-qxjvz\" (UID: \"756f0459-f880-44dc-8858-9e7dec8f28ee\") " pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.956964 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxx7w\" (UniqueName: \"kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w\") pod \"openstack-operator-index-qxjvz\" (UID: \"756f0459-f880-44dc-8858-9e7dec8f28ee\") " pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:01 crc kubenswrapper[4693]: I1204 09:59:01.978256 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxx7w\" (UniqueName: \"kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w\") pod \"openstack-operator-index-qxjvz\" (UID: \"756f0459-f880-44dc-8858-9e7dec8f28ee\") " pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:02 crc kubenswrapper[4693]: I1204 09:59:02.087127 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:02 crc kubenswrapper[4693]: I1204 09:59:02.486875 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:02 crc kubenswrapper[4693]: I1204 09:59:02.747846 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qxjvz" event={"ID":"756f0459-f880-44dc-8858-9e7dec8f28ee","Type":"ContainerStarted","Data":"6b5219be272f506251bd5f80352022731908e7b06feb7a834f01fe7821aa3729"} Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.149182 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.758046 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-shhvj"] Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.759001 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.766079 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-shhvj"] Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.804661 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2k6r\" (UniqueName: \"kubernetes.io/projected/f908e4a3-f7a8-4165-b9b1-8b25f43727e1-kube-api-access-d2k6r\") pod \"openstack-operator-index-shhvj\" (UID: \"f908e4a3-f7a8-4165-b9b1-8b25f43727e1\") " pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.906293 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2k6r\" (UniqueName: \"kubernetes.io/projected/f908e4a3-f7a8-4165-b9b1-8b25f43727e1-kube-api-access-d2k6r\") pod \"openstack-operator-index-shhvj\" (UID: \"f908e4a3-f7a8-4165-b9b1-8b25f43727e1\") " pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:05 crc kubenswrapper[4693]: I1204 09:59:05.929194 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2k6r\" (UniqueName: \"kubernetes.io/projected/f908e4a3-f7a8-4165-b9b1-8b25f43727e1-kube-api-access-d2k6r\") pod \"openstack-operator-index-shhvj\" (UID: \"f908e4a3-f7a8-4165-b9b1-8b25f43727e1\") " pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:06 crc kubenswrapper[4693]: I1204 09:59:06.080887 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:07 crc kubenswrapper[4693]: I1204 09:59:07.617753 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-shhvj"] Dec 04 09:59:07 crc kubenswrapper[4693]: W1204 09:59:07.629794 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf908e4a3_f7a8_4165_b9b1_8b25f43727e1.slice/crio-53c9d52f395d87e550ee299b6b29ac072665c73cdf276bb55771ebccda6762ea WatchSource:0}: Error finding container 53c9d52f395d87e550ee299b6b29ac072665c73cdf276bb55771ebccda6762ea: Status 404 returned error can't find the container with id 53c9d52f395d87e550ee299b6b29ac072665c73cdf276bb55771ebccda6762ea Dec 04 09:59:07 crc kubenswrapper[4693]: I1204 09:59:07.760262 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rc5g6" Dec 04 09:59:07 crc kubenswrapper[4693]: I1204 09:59:07.809236 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-shhvj" event={"ID":"f908e4a3-f7a8-4165-b9b1-8b25f43727e1","Type":"ContainerStarted","Data":"53c9d52f395d87e550ee299b6b29ac072665c73cdf276bb55771ebccda6762ea"} Dec 04 09:59:08 crc kubenswrapper[4693]: I1204 09:59:08.815593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-shhvj" event={"ID":"f908e4a3-f7a8-4165-b9b1-8b25f43727e1","Type":"ContainerStarted","Data":"a9f3b2bad0fdbaa85b31f7a8eb1f2edd87cf2d89cf5d91344654f2c3c753057c"} Dec 04 09:59:08 crc kubenswrapper[4693]: I1204 09:59:08.818377 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qxjvz" 
event={"ID":"756f0459-f880-44dc-8858-9e7dec8f28ee","Type":"ContainerStarted","Data":"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a"} Dec 04 09:59:08 crc kubenswrapper[4693]: I1204 09:59:08.818464 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qxjvz" podUID="756f0459-f880-44dc-8858-9e7dec8f28ee" containerName="registry-server" containerID="cri-o://d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a" gracePeriod=2 Dec 04 09:59:08 crc kubenswrapper[4693]: I1204 09:59:08.833104 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-shhvj" podStartSLOduration=3.170263688 podStartE2EDuration="3.833088677s" podCreationTimestamp="2025-12-04 09:59:05 +0000 UTC" firstStartedPulling="2025-12-04 09:59:07.631548991 +0000 UTC m=+993.529142744" lastFinishedPulling="2025-12-04 09:59:08.29437399 +0000 UTC m=+994.191967733" observedRunningTime="2025-12-04 09:59:08.830234049 +0000 UTC m=+994.727827812" watchObservedRunningTime="2025-12-04 09:59:08.833088677 +0000 UTC m=+994.730682430" Dec 04 09:59:08 crc kubenswrapper[4693]: I1204 09:59:08.847574 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qxjvz" podStartSLOduration=2.601719783 podStartE2EDuration="7.847558398s" podCreationTimestamp="2025-12-04 09:59:01 +0000 UTC" firstStartedPulling="2025-12-04 09:59:02.500825995 +0000 UTC m=+988.398419748" lastFinishedPulling="2025-12-04 09:59:07.74666461 +0000 UTC m=+993.644258363" observedRunningTime="2025-12-04 09:59:08.843633293 +0000 UTC m=+994.741227056" watchObservedRunningTime="2025-12-04 09:59:08.847558398 +0000 UTC m=+994.745152151" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.191096 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.250324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxx7w\" (UniqueName: \"kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w\") pod \"756f0459-f880-44dc-8858-9e7dec8f28ee\" (UID: \"756f0459-f880-44dc-8858-9e7dec8f28ee\") " Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.255899 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w" (OuterVolumeSpecName: "kube-api-access-qxx7w") pod "756f0459-f880-44dc-8858-9e7dec8f28ee" (UID: "756f0459-f880-44dc-8858-9e7dec8f28ee"). InnerVolumeSpecName "kube-api-access-qxx7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.352253 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxx7w\" (UniqueName: \"kubernetes.io/projected/756f0459-f880-44dc-8858-9e7dec8f28ee-kube-api-access-qxx7w\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.827072 4693 generic.go:334] "Generic (PLEG): container finished" podID="756f0459-f880-44dc-8858-9e7dec8f28ee" containerID="d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a" exitCode=0 Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.827139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qxjvz" event={"ID":"756f0459-f880-44dc-8858-9e7dec8f28ee","Type":"ContainerDied","Data":"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a"} Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.827182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qxjvz" event={"ID":"756f0459-f880-44dc-8858-9e7dec8f28ee","Type":"ContainerDied","Data":"6b5219be272f506251bd5f80352022731908e7b06feb7a834f01fe7821aa3729"} Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.827198 4693 scope.go:117] "RemoveContainer" containerID="d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.827116 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qxjvz" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.845788 4693 scope.go:117] "RemoveContainer" containerID="d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a" Dec 04 09:59:09 crc kubenswrapper[4693]: E1204 09:59:09.847099 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a\": container with ID starting with d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a not found: ID does not exist" containerID="d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.847136 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a"} err="failed to get container status \"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a\": rpc error: code = NotFound desc = could not find container \"d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a\": container with ID starting with d6fb248218f727d653c7962eda5796f968221682e8138ae5b8c6ed4dd16a7b0a not found: ID does not exist" Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.856007 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:09 crc kubenswrapper[4693]: I1204 09:59:09.859935 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qxjvz"] Dec 04 09:59:10 crc kubenswrapper[4693]: I1204 09:59:10.475168 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756f0459-f880-44dc-8858-9e7dec8f28ee" path="/var/lib/kubelet/pods/756f0459-f880-44dc-8858-9e7dec8f28ee/volumes" Dec 04 09:59:16 crc kubenswrapper[4693]: I1204 09:59:16.081433 4693 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:16 crc kubenswrapper[4693]: I1204 09:59:16.081980 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:16 crc kubenswrapper[4693]: I1204 09:59:16.130161 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:16 crc kubenswrapper[4693]: I1204 09:59:16.915856 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-shhvj" Dec 04 09:59:22 crc kubenswrapper[4693]: I1204 09:59:22.272875 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:59:22 crc kubenswrapper[4693]: I1204 09:59:22.273468 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.015273 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4"] Dec 04 09:59:23 crc kubenswrapper[4693]: E1204 09:59:23.015568 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756f0459-f880-44dc-8858-9e7dec8f28ee" containerName="registry-server" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.015581 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="756f0459-f880-44dc-8858-9e7dec8f28ee" containerName="registry-server" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.015678 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="756f0459-f880-44dc-8858-9e7dec8f28ee" containerName="registry-server" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.016488 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.018792 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dsdpr" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.058149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4"] Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.142320 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmgv\" (UniqueName: \"kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.142402 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.142448 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.243602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmgv\" (UniqueName: \"kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.243666 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.243709 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.244176 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.244390 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.262071 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmgv\" (UniqueName: \"kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv\") pod \"14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.338792 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.812104 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4"] Dec 04 09:59:23 crc kubenswrapper[4693]: W1204 09:59:23.816444 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53398369_8904_43ae_be7b_ced663828e5e.slice/crio-0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476 WatchSource:0}: Error finding container 0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476: Status 404 returned error can't find the container with id 0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476 Dec 04 09:59:23 crc kubenswrapper[4693]: I1204 09:59:23.917708 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" event={"ID":"53398369-8904-43ae-be7b-ced663828e5e","Type":"ContainerStarted","Data":"0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476"} Dec 04 09:59:24 crc kubenswrapper[4693]: I1204 09:59:24.925196 4693 generic.go:334] "Generic (PLEG): container finished" podID="53398369-8904-43ae-be7b-ced663828e5e" containerID="76a8b2c0d7d4c39fa43d4973923bfa9178adb30a241696276869fdab834c9875" exitCode=0 Dec 04 09:59:24 crc kubenswrapper[4693]: I1204 09:59:24.925304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" event={"ID":"53398369-8904-43ae-be7b-ced663828e5e","Type":"ContainerDied","Data":"76a8b2c0d7d4c39fa43d4973923bfa9178adb30a241696276869fdab834c9875"} Dec 04 09:59:27 crc kubenswrapper[4693]: I1204 09:59:27.945025 4693 generic.go:334] "Generic (PLEG): container finished" podID="53398369-8904-43ae-be7b-ced663828e5e" containerID="40ab3d23b88184e803755e28f5cfcd303864f9c4d3dfa68cc55550cdf17d4100" exitCode=0 Dec 04 09:59:27 crc kubenswrapper[4693]: I1204 09:59:27.945076 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" event={"ID":"53398369-8904-43ae-be7b-ced663828e5e","Type":"ContainerDied","Data":"40ab3d23b88184e803755e28f5cfcd303864f9c4d3dfa68cc55550cdf17d4100"} Dec 04 09:59:28 crc kubenswrapper[4693]: I1204 09:59:28.955275 4693 generic.go:334] "Generic (PLEG): container finished" podID="53398369-8904-43ae-be7b-ced663828e5e" containerID="5de6a5479dc45644ba10a6b0c91ec613b69ce20794e04727a7a6617610306083" exitCode=0 Dec 04 09:59:28 crc kubenswrapper[4693]: I1204 09:59:28.955385 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" event={"ID":"53398369-8904-43ae-be7b-ced663828e5e","Type":"ContainerDied","Data":"5de6a5479dc45644ba10a6b0c91ec613b69ce20794e04727a7a6617610306083"} Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.190398 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.292026 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmgv\" (UniqueName: \"kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv\") pod \"53398369-8904-43ae-be7b-ced663828e5e\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.292189 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util\") pod \"53398369-8904-43ae-be7b-ced663828e5e\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.292228 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle\") pod \"53398369-8904-43ae-be7b-ced663828e5e\" (UID: \"53398369-8904-43ae-be7b-ced663828e5e\") " Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.292831 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle" (OuterVolumeSpecName: "bundle") pod "53398369-8904-43ae-be7b-ced663828e5e" (UID: "53398369-8904-43ae-be7b-ced663828e5e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.296654 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv" (OuterVolumeSpecName: "kube-api-access-jrmgv") pod "53398369-8904-43ae-be7b-ced663828e5e" (UID: "53398369-8904-43ae-be7b-ced663828e5e"). InnerVolumeSpecName "kube-api-access-jrmgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.303102 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util" (OuterVolumeSpecName: "util") pod "53398369-8904-43ae-be7b-ced663828e5e" (UID: "53398369-8904-43ae-be7b-ced663828e5e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.394379 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrmgv\" (UniqueName: \"kubernetes.io/projected/53398369-8904-43ae-be7b-ced663828e5e-kube-api-access-jrmgv\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.394411 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.394421 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53398369-8904-43ae-be7b-ced663828e5e-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.969199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" event={"ID":"53398369-8904-43ae-be7b-ced663828e5e","Type":"ContainerDied","Data":"0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476"} Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.969247 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0efce13a0498521052042a42c6fc6d01eeadc21be03460ffb12871a1ad611476" Dec 04 09:59:30 crc kubenswrapper[4693]: I1204 09:59:30.969290 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.077791 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv"] Dec 04 09:59:35 crc kubenswrapper[4693]: E1204 09:59:35.078295 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="pull" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.078554 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="pull" Dec 04 09:59:35 crc kubenswrapper[4693]: E1204 09:59:35.078566 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="util" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.078573 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="util" Dec 04 09:59:35 crc kubenswrapper[4693]: E1204 09:59:35.078585 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="extract" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.078590 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="extract" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.078696 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="53398369-8904-43ae-be7b-ced663828e5e" containerName="extract" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.079111 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.080913 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mb246" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.101060 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv"] Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.162038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtg9\" (UniqueName: \"kubernetes.io/projected/4c440aca-49fc-4f5b-8890-d2b8c021febf-kube-api-access-fvtg9\") pod \"openstack-operator-controller-operator-5959575f68-6dnpv\" (UID: \"4c440aca-49fc-4f5b-8890-d2b8c021febf\") " pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.263418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtg9\" (UniqueName: \"kubernetes.io/projected/4c440aca-49fc-4f5b-8890-d2b8c021febf-kube-api-access-fvtg9\") pod \"openstack-operator-controller-operator-5959575f68-6dnpv\" (UID: \"4c440aca-49fc-4f5b-8890-d2b8c021febf\") " pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.298120 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtg9\" (UniqueName: \"kubernetes.io/projected/4c440aca-49fc-4f5b-8890-d2b8c021febf-kube-api-access-fvtg9\") pod \"openstack-operator-controller-operator-5959575f68-6dnpv\" (UID: \"4c440aca-49fc-4f5b-8890-d2b8c021febf\") " pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.393761 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.847185 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.848878 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv"] Dec 04 09:59:35 crc kubenswrapper[4693]: I1204 09:59:35.994474 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" event={"ID":"4c440aca-49fc-4f5b-8890-d2b8c021febf","Type":"ContainerStarted","Data":"1656b2ea9a2ecf2203cb2356055eb0517d5851143957db19cc5f120dceeb22e9"} Dec 04 09:59:42 crc kubenswrapper[4693]: I1204 09:59:42.044359 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" event={"ID":"4c440aca-49fc-4f5b-8890-d2b8c021febf","Type":"ContainerStarted","Data":"d0980a31343e2b02406fc7edc9f0c7928002689164f78b9268a550ccbeb565de"} Dec 04 09:59:42 crc kubenswrapper[4693]: I1204 09:59:42.044938 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 09:59:42 crc kubenswrapper[4693]: I1204 09:59:42.074137 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" podStartSLOduration=1.799884362 podStartE2EDuration="7.074115124s" podCreationTimestamp="2025-12-04 09:59:35 +0000 UTC" firstStartedPulling="2025-12-04 09:59:35.84690239 +0000 UTC m=+1021.744496133" lastFinishedPulling="2025-12-04 09:59:41.121133142 +0000 UTC m=+1027.018726895" observedRunningTime="2025-12-04 09:59:42.067857845 +0000 UTC m=+1027.965451598" watchObservedRunningTime="2025-12-04 09:59:42.074115124 +0000 UTC m=+1027.971708877" Dec 04 09:59:52 crc kubenswrapper[4693]: I1204 09:59:52.272834 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:59:52 crc kubenswrapper[4693]: I1204 09:59:52.274437 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:59:55 crc kubenswrapper[4693]: I1204 09:59:55.397028 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5959575f68-6dnpv" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.149372 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn"] Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.150263 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.152357 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.152713 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.160974 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn"] Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.296776 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.296844 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vl2\" (UniqueName: \"kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.296861 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.398367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.398429 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vl2\" (UniqueName: \"kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.398448 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.399259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume\") pod 
\"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.405820 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.415415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vl2\" (UniqueName: \"kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2\") pod \"collect-profiles-29414040-grzzn\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.472601 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:00 crc kubenswrapper[4693]: I1204 10:00:00.650454 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn"] Dec 04 10:00:01 crc kubenswrapper[4693]: I1204 10:00:01.164446 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" event={"ID":"274f76aa-470f-4b21-830f-650997de36c3","Type":"ContainerStarted","Data":"b1ae4eb4959369467400a7cc229bba38ce351e14d6587d8fa83b7f336a4d63b0"} Dec 04 10:00:01 crc kubenswrapper[4693]: I1204 10:00:01.164740 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" event={"ID":"274f76aa-470f-4b21-830f-650997de36c3","Type":"ContainerStarted","Data":"534b78a1d49873f3fe2fcf013d4ecfa57e033211dbc68d5956dc6cfa3d206922"} Dec 04 10:00:02 crc kubenswrapper[4693]: I1204 10:00:02.171141 4693 generic.go:334] "Generic (PLEG): container finished" podID="274f76aa-470f-4b21-830f-650997de36c3" containerID="b1ae4eb4959369467400a7cc229bba38ce351e14d6587d8fa83b7f336a4d63b0" exitCode=0 Dec 04 10:00:02 crc kubenswrapper[4693]: I1204 10:00:02.171205 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" event={"ID":"274f76aa-470f-4b21-830f-650997de36c3","Type":"ContainerDied","Data":"b1ae4eb4959369467400a7cc229bba38ce351e14d6587d8fa83b7f336a4d63b0"} Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.401519 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.541221 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume\") pod \"274f76aa-470f-4b21-830f-650997de36c3\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.541276 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume\") pod \"274f76aa-470f-4b21-830f-650997de36c3\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.541312 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vl2\" (UniqueName: \"kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2\") pod \"274f76aa-470f-4b21-830f-650997de36c3\" (UID: \"274f76aa-470f-4b21-830f-650997de36c3\") " Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.542032 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "274f76aa-470f-4b21-830f-650997de36c3" (UID: "274f76aa-470f-4b21-830f-650997de36c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.546469 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2" (OuterVolumeSpecName: "kube-api-access-s7vl2") pod "274f76aa-470f-4b21-830f-650997de36c3" (UID: "274f76aa-470f-4b21-830f-650997de36c3"). InnerVolumeSpecName "kube-api-access-s7vl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.546474 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "274f76aa-470f-4b21-830f-650997de36c3" (UID: "274f76aa-470f-4b21-830f-650997de36c3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.643229 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/274f76aa-470f-4b21-830f-650997de36c3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.643269 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/274f76aa-470f-4b21-830f-650997de36c3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:03 crc kubenswrapper[4693]: I1204 10:00:03.643279 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7vl2\" (UniqueName: \"kubernetes.io/projected/274f76aa-470f-4b21-830f-650997de36c3-kube-api-access-s7vl2\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:04 crc kubenswrapper[4693]: I1204 10:00:04.184672 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" event={"ID":"274f76aa-470f-4b21-830f-650997de36c3","Type":"ContainerDied","Data":"534b78a1d49873f3fe2fcf013d4ecfa57e033211dbc68d5956dc6cfa3d206922"} Dec 04 10:00:04 crc kubenswrapper[4693]: I1204 10:00:04.184713 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534b78a1d49873f3fe2fcf013d4ecfa57e033211dbc68d5956dc6cfa3d206922" Dec 04 10:00:04 crc kubenswrapper[4693]: I1204 10:00:04.185086 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn" Dec 04 10:00:22 crc kubenswrapper[4693]: I1204 10:00:22.273164 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:00:22 crc kubenswrapper[4693]: I1204 10:00:22.273738 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:00:22 crc kubenswrapper[4693]: I1204 10:00:22.273782 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:00:22 crc kubenswrapper[4693]: I1204 10:00:22.274360 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:00:22 crc kubenswrapper[4693]: I1204 10:00:22.274415 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a" gracePeriod=600 Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.007714 4693 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw"] Dec 04 10:00:23 crc kubenswrapper[4693]: E1204 10:00:23.008270 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274f76aa-470f-4b21-830f-650997de36c3" containerName="collect-profiles" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.008287 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="274f76aa-470f-4b21-830f-650997de36c3" containerName="collect-profiles" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.008466 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="274f76aa-470f-4b21-830f-650997de36c3" containerName="collect-profiles" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.009067 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.011024 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bgfgc" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.031499 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.035540 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.036543 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.040304 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.041262 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.050749 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-grtk2" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.051083 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-sp6gd" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.061560 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.085454 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.110210 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.111551 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.118018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29brj\" (UniqueName: \"kubernetes.io/projected/5aa92828-abd4-4f89-9621-5e9830101fca-kube-api-access-29brj\") pod \"barbican-operator-controller-manager-7d9dfd778-vtgdw\" (UID: \"5aa92828-abd4-4f89-9621-5e9830101fca\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.125158 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-98hrp" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.125878 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.127104 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.129417 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fndk2" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.157056 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.168444 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.187392 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.188447 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.192164 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-j2v8p" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.210542 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-46wh4"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.211579 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.214775 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wj2vf" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.215260 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.218895 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prkgl\" (UniqueName: \"kubernetes.io/projected/4ba18ef1-50c1-48d0-9d2e-3c83c65913ab-kube-api-access-prkgl\") pod \"heat-operator-controller-manager-5f64f6f8bb-jgmqz\" (UID: \"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.218943 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5m7r\" (UniqueName: \"kubernetes.io/projected/f3a27983-d919-48fb-a227-f6a45efef985-kube-api-access-c5m7r\") pod \"cinder-operator-controller-manager-859b6ccc6-5j4cs\" (UID: \"f3a27983-d919-48fb-a227-f6a45efef985\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.219000 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmgd\" (UniqueName: \"kubernetes.io/projected/2bb25289-630f-46c3-96f0-b5ea8177f5d8-kube-api-access-tgmgd\") pod \"designate-operator-controller-manager-78b4bc895b-q8crl\" (UID: \"2bb25289-630f-46c3-96f0-b5ea8177f5d8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.219027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29brj\" (UniqueName: \"kubernetes.io/projected/5aa92828-abd4-4f89-9621-5e9830101fca-kube-api-access-29brj\") pod \"barbican-operator-controller-manager-7d9dfd778-vtgdw\" (UID: \"5aa92828-abd4-4f89-9621-5e9830101fca\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.219046 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr5r\" (UniqueName: \"kubernetes.io/projected/7fb21378-fa3f-41a2-a6da-80831acec23c-kube-api-access-pjr5r\") pod \"glance-operator-controller-manager-77987cd8cd-tnzrv\" (UID: \"7fb21378-fa3f-41a2-a6da-80831acec23c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.223077 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.233527 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-46wh4"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.241974 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.243293 4693 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.251442 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.252520 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.254530 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h4p9c" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.255691 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4t7nm" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.266988 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29brj\" (UniqueName: \"kubernetes.io/projected/5aa92828-abd4-4f89-9621-5e9830101fca-kube-api-access-29brj\") pod \"barbican-operator-controller-manager-7d9dfd778-vtgdw\" (UID: \"5aa92828-abd4-4f89-9621-5e9830101fca\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.286653 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.295234 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.310984 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a" exitCode=0 Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.311057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a"} Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.311113 4693 scope.go:117] "RemoveContainer" containerID="1c4f05ea0deb5052be910dd2a2555e8c09134b71b724016170b902ec2aaa9b89" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320637 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhrxn\" (UniqueName: \"kubernetes.io/projected/79466fca-aa64-407e-9488-d89e43d4bed9-kube-api-access-qhrxn\") pod \"horizon-operator-controller-manager-68c6d99b8f-4gqg9\" (UID: \"79466fca-aa64-407e-9488-d89e43d4bed9\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82wk\" (UniqueName: \"kubernetes.io/projected/b7bce599-dd9d-43c5-b5a9-53a081b6f183-kube-api-access-v82wk\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc 
kubenswrapper[4693]: I1204 10:00:23.320764 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmgd\" (UniqueName: \"kubernetes.io/projected/2bb25289-630f-46c3-96f0-b5ea8177f5d8-kube-api-access-tgmgd\") pod \"designate-operator-controller-manager-78b4bc895b-q8crl\" (UID: \"2bb25289-630f-46c3-96f0-b5ea8177f5d8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320797 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320828 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr5r\" (UniqueName: \"kubernetes.io/projected/7fb21378-fa3f-41a2-a6da-80831acec23c-kube-api-access-pjr5r\") pod \"glance-operator-controller-manager-77987cd8cd-tnzrv\" (UID: \"7fb21378-fa3f-41a2-a6da-80831acec23c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320911 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prkgl\" (UniqueName: \"kubernetes.io/projected/4ba18ef1-50c1-48d0-9d2e-3c83c65913ab-kube-api-access-prkgl\") pod \"heat-operator-controller-manager-5f64f6f8bb-jgmqz\" (UID: \"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.320942 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5m7r\" (UniqueName: \"kubernetes.io/projected/f3a27983-d919-48fb-a227-f6a45efef985-kube-api-access-c5m7r\") pod \"cinder-operator-controller-manager-859b6ccc6-5j4cs\" (UID: \"f3a27983-d919-48fb-a227-f6a45efef985\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.328195 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.349180 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.350226 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.351998 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gcxrm" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.354038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr5r\" (UniqueName: \"kubernetes.io/projected/7fb21378-fa3f-41a2-a6da-80831acec23c-kube-api-access-pjr5r\") pod \"glance-operator-controller-manager-77987cd8cd-tnzrv\" (UID: \"7fb21378-fa3f-41a2-a6da-80831acec23c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.354086 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmgd\" (UniqueName: \"kubernetes.io/projected/2bb25289-630f-46c3-96f0-b5ea8177f5d8-kube-api-access-tgmgd\") pod \"designate-operator-controller-manager-78b4bc895b-q8crl\" (UID: \"2bb25289-630f-46c3-96f0-b5ea8177f5d8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.370656 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prkgl\" (UniqueName: \"kubernetes.io/projected/4ba18ef1-50c1-48d0-9d2e-3c83c65913ab-kube-api-access-prkgl\") pod \"heat-operator-controller-manager-5f64f6f8bb-jgmqz\" (UID: \"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.385950 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.395908 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5m7r\" (UniqueName: \"kubernetes.io/projected/f3a27983-d919-48fb-a227-f6a45efef985-kube-api-access-c5m7r\") pod \"cinder-operator-controller-manager-859b6ccc6-5j4cs\" (UID: \"f3a27983-d919-48fb-a227-f6a45efef985\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.406341 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.407893 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.428395 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.429397 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhrxn\" (UniqueName: \"kubernetes.io/projected/79466fca-aa64-407e-9488-d89e43d4bed9-kube-api-access-qhrxn\") pod \"horizon-operator-controller-manager-68c6d99b8f-4gqg9\" (UID: \"79466fca-aa64-407e-9488-d89e43d4bed9\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.429432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82wk\" (UniqueName: \"kubernetes.io/projected/b7bce599-dd9d-43c5-b5a9-53a081b6f183-kube-api-access-v82wk\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.429466 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.429496 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhnm2\" (UniqueName: \"kubernetes.io/projected/65a2270f-58bd-486b-9be3-c85fee980070-kube-api-access-dhnm2\") pod \"keystone-operator-controller-manager-7765d96ddf-gtwgq\" (UID: \"65a2270f-58bd-486b-9be3-c85fee980070\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.429562 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbhql\" (UniqueName: \"kubernetes.io/projected/37983465-c081-4645-9a0b-47431d284dbe-kube-api-access-tbhql\") pod \"ironic-operator-controller-manager-6c548fd776-qcc4g\" (UID: \"37983465-c081-4645-9a0b-47431d284dbe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:00:23 crc kubenswrapper[4693]: E1204 10:00:23.430010 4693 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:23 crc kubenswrapper[4693]: E1204 10:00:23.430072 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert podName:b7bce599-dd9d-43c5-b5a9-53a081b6f183 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:23.930055631 +0000 UTC m=+1069.827649384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert") pod "infra-operator-controller-manager-57548d458d-46wh4" (UID: "b7bce599-dd9d-43c5-b5a9-53a081b6f183") : secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.434384 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.446612 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.467825 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.471238 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6m2j9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.473764 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vg65v" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.522039 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.538134 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhrxn\" (UniqueName: \"kubernetes.io/projected/79466fca-aa64-407e-9488-d89e43d4bed9-kube-api-access-qhrxn\") pod \"horizon-operator-controller-manager-68c6d99b8f-4gqg9\" (UID: \"79466fca-aa64-407e-9488-d89e43d4bed9\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.539299 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.540274 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbhql\" (UniqueName: \"kubernetes.io/projected/37983465-c081-4645-9a0b-47431d284dbe-kube-api-access-tbhql\") pod \"ironic-operator-controller-manager-6c548fd776-qcc4g\" (UID: \"37983465-c081-4645-9a0b-47431d284dbe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.540322 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb2z\" (UniqueName: \"kubernetes.io/projected/1de6adcf-e847-4a10-af8c-683f83c32551-kube-api-access-nrb2z\") pod \"manila-operator-controller-manager-7c79b5df47-4mp89\" (UID: \"1de6adcf-e847-4a10-af8c-683f83c32551\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.540368 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tjw8\" (UniqueName: \"kubernetes.io/projected/ae731a83-bab7-4843-b413-e8b03a3ca1c3-kube-api-access-9tjw8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-np6kl\" (UID: \"ae731a83-bab7-4843-b413-e8b03a3ca1c3\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.540428 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8mr8\" (UniqueName: \"kubernetes.io/projected/92ac4c28-9d59-4955-b5cf-ae45e97fdeed-kube-api-access-m8mr8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-w7zh6\" (UID: \"92ac4c28-9d59-4955-b5cf-ae45e97fdeed\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.540465 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhnm2\" (UniqueName: \"kubernetes.io/projected/65a2270f-58bd-486b-9be3-c85fee980070-kube-api-access-dhnm2\") pod \"keystone-operator-controller-manager-7765d96ddf-gtwgq\" (UID: \"65a2270f-58bd-486b-9be3-c85fee980070\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.541282 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.567580 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82wk\" (UniqueName: \"kubernetes.io/projected/b7bce599-dd9d-43c5-b5a9-53a081b6f183-kube-api-access-v82wk\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.693454 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhnm2\" (UniqueName: \"kubernetes.io/projected/65a2270f-58bd-486b-9be3-c85fee980070-kube-api-access-dhnm2\") pod \"keystone-operator-controller-manager-7765d96ddf-gtwgq\" (UID: \"65a2270f-58bd-486b-9be3-c85fee980070\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.698993 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.699681 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tjw8\" (UniqueName: \"kubernetes.io/projected/ae731a83-bab7-4843-b413-e8b03a3ca1c3-kube-api-access-9tjw8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-np6kl\" (UID: \"ae731a83-bab7-4843-b413-e8b03a3ca1c3\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.699753 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8mr8\" (UniqueName: \"kubernetes.io/projected/92ac4c28-9d59-4955-b5cf-ae45e97fdeed-kube-api-access-m8mr8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-w7zh6\" (UID: \"92ac4c28-9d59-4955-b5cf-ae45e97fdeed\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.699910 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb2z\" (UniqueName: \"kubernetes.io/projected/1de6adcf-e847-4a10-af8c-683f83c32551-kube-api-access-nrb2z\") pod \"manila-operator-controller-manager-7c79b5df47-4mp89\" (UID: \"1de6adcf-e847-4a10-af8c-683f83c32551\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.700783 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.706594 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.708088 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.725687 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pgkzg" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.727443 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbhql\" (UniqueName: \"kubernetes.io/projected/37983465-c081-4645-9a0b-47431d284dbe-kube-api-access-tbhql\") pod \"ironic-operator-controller-manager-6c548fd776-qcc4g\" (UID: \"37983465-c081-4645-9a0b-47431d284dbe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.741516 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.743026 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.746932 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-b4vkq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.753057 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tjw8\" (UniqueName: \"kubernetes.io/projected/ae731a83-bab7-4843-b413-e8b03a3ca1c3-kube-api-access-9tjw8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-np6kl\" (UID: \"ae731a83-bab7-4843-b413-e8b03a3ca1c3\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.768707 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.786980 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.799445 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8mr8\" (UniqueName: \"kubernetes.io/projected/92ac4c28-9d59-4955-b5cf-ae45e97fdeed-kube-api-access-m8mr8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-w7zh6\" (UID: \"92ac4c28-9d59-4955-b5cf-ae45e97fdeed\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.802603 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.804513 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb2z\" (UniqueName: \"kubernetes.io/projected/1de6adcf-e847-4a10-af8c-683f83c32551-kube-api-access-nrb2z\") pod \"manila-operator-controller-manager-7c79b5df47-4mp89\" (UID: \"1de6adcf-e847-4a10-af8c-683f83c32551\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.809417 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.811776 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.822200 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.825604 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.827124 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.836675 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fjfsb" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.837481 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jrzrf" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.837682 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.838508 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.839406 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.839516 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.854615 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vqnfm" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.855858 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.857197 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.867915 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.868257 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-scp8g" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.895615 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmtpz\" (UniqueName: \"kubernetes.io/projected/23baa4a2-ca26-41a5-968c-f642ca80d1fa-kube-api-access-bmtpz\") pod \"swift-operator-controller-manager-5f8c65bbfc-gfvq4\" (UID: \"23baa4a2-ca26-41a5-968c-f642ca80d1fa\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911551 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncww4\" (UniqueName: \"kubernetes.io/projected/b352e856-6946-41ed-8d06-46b1ab00185e-kube-api-access-ncww4\") pod \"ovn-operator-controller-manager-b6456fdb6-cxrdm\" (UID: \"b352e856-6946-41ed-8d06-46b1ab00185e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911594 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6kz\" (UniqueName: \"kubernetes.io/projected/9b4ce9a3-bc13-4726-af72-c0f4c619efec-kube-api-access-7t6kz\") pod \"octavia-operator-controller-manager-998648c74-zg6gn\" (UID: \"9b4ce9a3-bc13-4726-af72-c0f4c619efec\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911672 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npbf\" (UniqueName: \"kubernetes.io/projected/287b0c68-a203-4af6-b654-2eb97b004cdc-kube-api-access-9npbf\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911702 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8v5m\" (UniqueName: \"kubernetes.io/projected/75a5a37b-eb32-4654-85f7-1c7b9de1c247-kube-api-access-w8v5m\") pod 
\"placement-operator-controller-manager-78f8948974-4zbtc\" (UID: \"75a5a37b-eb32-4654-85f7-1c7b9de1c247\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.911735 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559jg\" (UniqueName: \"kubernetes.io/projected/fc97bab3-20bf-4931-868d-a20ad433cc81-kube-api-access-559jg\") pod \"nova-operator-controller-manager-697bc559fc-klr7c\" (UID: \"fc97bab3-20bf-4931-868d-a20ad433cc81\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.920952 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.940177 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.941429 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.942561 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.954569 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2bp9k" Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.954837 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc"] Dec 04 10:00:23 crc kubenswrapper[4693]: I1204 10:00:23.985137 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.004125 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.009466 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rtcvq" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.013265 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.013875 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncww4\" (UniqueName: \"kubernetes.io/projected/b352e856-6946-41ed-8d06-46b1ab00185e-kube-api-access-ncww4\") pod \"ovn-operator-controller-manager-b6456fdb6-cxrdm\" (UID: \"b352e856-6946-41ed-8d06-46b1ab00185e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014169 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6kz\" (UniqueName: \"kubernetes.io/projected/9b4ce9a3-bc13-4726-af72-c0f4c619efec-kube-api-access-7t6kz\") pod \"octavia-operator-controller-manager-998648c74-zg6gn\" (UID: \"9b4ce9a3-bc13-4726-af72-c0f4c619efec\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014252 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014359 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npbf\" (UniqueName: \"kubernetes.io/projected/287b0c68-a203-4af6-b654-2eb97b004cdc-kube-api-access-9npbf\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014379 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8v5m\" (UniqueName: \"kubernetes.io/projected/75a5a37b-eb32-4654-85f7-1c7b9de1c247-kube-api-access-w8v5m\") pod \"placement-operator-controller-manager-78f8948974-4zbtc\" (UID: \"75a5a37b-eb32-4654-85f7-1c7b9de1c247\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014397 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014419 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559jg\" (UniqueName: \"kubernetes.io/projected/fc97bab3-20bf-4931-868d-a20ad433cc81-kube-api-access-559jg\") pod 
\"nova-operator-controller-manager-697bc559fc-klr7c\" (UID: \"fc97bab3-20bf-4931-868d-a20ad433cc81\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.014482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmtpz\" (UniqueName: \"kubernetes.io/projected/23baa4a2-ca26-41a5-968c-f642ca80d1fa-kube-api-access-bmtpz\") pod \"swift-operator-controller-manager-5f8c65bbfc-gfvq4\" (UID: \"23baa4a2-ca26-41a5-968c-f642ca80d1fa\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.014814 4693 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.014859 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert podName:b7bce599-dd9d-43c5-b5a9-53a081b6f183 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:25.014844986 +0000 UTC m=+1070.912438729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert") pod "infra-operator-controller-manager-57548d458d-46wh4" (UID: "b7bce599-dd9d-43c5-b5a9-53a081b6f183") : secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.014966 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.015010 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:24.51499385 +0000 UTC m=+1070.412587603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.040344 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.050004 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.052313 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncww4\" (UniqueName: \"kubernetes.io/projected/b352e856-6946-41ed-8d06-46b1ab00185e-kube-api-access-ncww4\") pod \"ovn-operator-controller-manager-b6456fdb6-cxrdm\" (UID: \"b352e856-6946-41ed-8d06-46b1ab00185e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.052416 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.053610 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.068877 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8v5m\" (UniqueName: \"kubernetes.io/projected/75a5a37b-eb32-4654-85f7-1c7b9de1c247-kube-api-access-w8v5m\") pod \"placement-operator-controller-manager-78f8948974-4zbtc\" (UID: \"75a5a37b-eb32-4654-85f7-1c7b9de1c247\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.073019 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.073529 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kg9hw" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.073555 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmtpz\" (UniqueName: \"kubernetes.io/projected/23baa4a2-ca26-41a5-968c-f642ca80d1fa-kube-api-access-bmtpz\") pod \"swift-operator-controller-manager-5f8c65bbfc-gfvq4\" (UID: \"23baa4a2-ca26-41a5-968c-f642ca80d1fa\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.074653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6kz\" (UniqueName: \"kubernetes.io/projected/9b4ce9a3-bc13-4726-af72-c0f4c619efec-kube-api-access-7t6kz\") pod \"octavia-operator-controller-manager-998648c74-zg6gn\" (UID: \"9b4ce9a3-bc13-4726-af72-c0f4c619efec\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.078301 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npbf\" (UniqueName: \"kubernetes.io/projected/287b0c68-a203-4af6-b654-2eb97b004cdc-kube-api-access-9npbf\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.081762 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559jg\" (UniqueName: \"kubernetes.io/projected/fc97bab3-20bf-4931-868d-a20ad433cc81-kube-api-access-559jg\") pod \"nova-operator-controller-manager-697bc559fc-klr7c\" (UID: 
\"fc97bab3-20bf-4931-868d-a20ad433cc81\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.116267 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mhn\" (UniqueName: \"kubernetes.io/projected/a37bfc80-1ecc-4547-8fbe-be223b9a5cc2-kube-api-access-f4mhn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zmpkb\" (UID: \"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.116758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhs86\" (UniqueName: \"kubernetes.io/projected/b2d582c6-b444-4591-93c3-7681714732bc-kube-api-access-nhs86\") pod \"test-operator-controller-manager-5854674fcc-vfrzv\" (UID: \"b2d582c6-b444-4591-93c3-7681714732bc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.118060 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.119668 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.136684 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.137007 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-czv5q" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.137237 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.174266 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.192627 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.197471 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.218665 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.219441 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.219761 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221085 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221115 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221137 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kghfm\" (UniqueName: \"kubernetes.io/projected/383c7650-d095-4996-88c6-06d999b1973b-kube-api-access-kghfm\") pod \"watcher-operator-controller-manager-769dc69bc-pzpj7\" (UID: \"383c7650-d095-4996-88c6-06d999b1973b\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221178 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhx5\" (UniqueName: \"kubernetes.io/projected/9b4532d5-fce3-43a3-b72c-c0752eae7945-kube-api-access-mxhx5\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221224 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mhn\" (UniqueName: \"kubernetes.io/projected/a37bfc80-1ecc-4547-8fbe-be223b9a5cc2-kube-api-access-f4mhn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zmpkb\" (UID: \"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.221255 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhs86\" (UniqueName: \"kubernetes.io/projected/b2d582c6-b444-4591-93c3-7681714732bc-kube-api-access-nhs86\") pod \"test-operator-controller-manager-5854674fcc-vfrzv\" (UID: \"b2d582c6-b444-4591-93c3-7681714732bc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.224484 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-pvrl2" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.225174 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.252968 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhs86\" (UniqueName: 
\"kubernetes.io/projected/b2d582c6-b444-4591-93c3-7681714732bc-kube-api-access-nhs86\") pod \"test-operator-controller-manager-5854674fcc-vfrzv\" (UID: \"b2d582c6-b444-4591-93c3-7681714732bc\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.253998 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mhn\" (UniqueName: \"kubernetes.io/projected/a37bfc80-1ecc-4547-8fbe-be223b9a5cc2-kube-api-access-f4mhn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-zmpkb\" (UID: \"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.259702 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.274949 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.317406 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.322891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.322946 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.322977 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kghfm\" (UniqueName: \"kubernetes.io/projected/383c7650-d095-4996-88c6-06d999b1973b-kube-api-access-kghfm\") pod \"watcher-operator-controller-manager-769dc69bc-pzpj7\" (UID: \"383c7650-d095-4996-88c6-06d999b1973b\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.323013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdrz\" (UniqueName: \"kubernetes.io/projected/4ffbc9ab-625c-467a-b3cb-017b4167d8a1-kube-api-access-dzdrz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xzz9b\" (UID: \"4ffbc9ab-625c-467a-b3cb-017b4167d8a1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.323065 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhx5\" (UniqueName: \"kubernetes.io/projected/9b4532d5-fce3-43a3-b72c-c0752eae7945-kube-api-access-mxhx5\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: 
\"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.323272 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.323326 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:24.823308233 +0000 UTC m=+1070.720901986 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.323269 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.323408 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:24.823399236 +0000 UTC m=+1070.720992989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.342801 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf"} Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.344520 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhx5\" (UniqueName: \"kubernetes.io/projected/9b4532d5-fce3-43a3-b72c-c0752eae7945-kube-api-access-mxhx5\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.368295 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kghfm\" (UniqueName: \"kubernetes.io/projected/383c7650-d095-4996-88c6-06d999b1973b-kube-api-access-kghfm\") pod \"watcher-operator-controller-manager-769dc69bc-pzpj7\" (UID: \"383c7650-d095-4996-88c6-06d999b1973b\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.392656 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.403785 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.433006 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdrz\" (UniqueName: \"kubernetes.io/projected/4ffbc9ab-625c-467a-b3cb-017b4167d8a1-kube-api-access-dzdrz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xzz9b\" (UID: \"4ffbc9ab-625c-467a-b3cb-017b4167d8a1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.457960 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdrz\" (UniqueName: \"kubernetes.io/projected/4ffbc9ab-625c-467a-b3cb-017b4167d8a1-kube-api-access-dzdrz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xzz9b\" (UID: \"4ffbc9ab-625c-467a-b3cb-017b4167d8a1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.535920 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.537223 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.537298 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:25.537280791 +0000 UTC m=+1071.434874544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.571486 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.847181 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.847583 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.847809 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.847872 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:25.847852656 +0000 UTC m=+1071.745446409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.848257 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: E1204 10:00:24.848294 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:25.848284087 +0000 UTC m=+1071.745877840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.880803 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw"] Dec 04 10:00:24 crc kubenswrapper[4693]: I1204 10:00:24.890695 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.027982 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.034192 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz"] Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.038150 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba18ef1_50c1_48d0_9d2e_3c83c65913ab.slice/crio-634d5cdd5d0cab805c440f6272d257954e225c9c450312d94181f2613cd43918 WatchSource:0}: Error finding container 634d5cdd5d0cab805c440f6272d257954e225c9c450312d94181f2613cd43918: Status 404 returned error can't find the container with id 634d5cdd5d0cab805c440f6272d257954e225c9c450312d94181f2613cd43918 Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.041847 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv"] Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.044767 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb21378_fa3f_41a2_a6da_80831acec23c.slice/crio-9c9ff0a4f4e07e72d4c59c0ba91a812baa3c7a616502f8c11e6a369ee61be8d8 WatchSource:0}: Error finding container 9c9ff0a4f4e07e72d4c59c0ba91a812baa3c7a616502f8c11e6a369ee61be8d8: Status 404 returned error can't find the container with id 9c9ff0a4f4e07e72d4c59c0ba91a812baa3c7a616502f8c11e6a369ee61be8d8 Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.057817 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.058107 4693 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.058230 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert podName:b7bce599-dd9d-43c5-b5a9-53a081b6f183 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:27.058204655 +0000 UTC m=+1072.955798458 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert") pod "infra-operator-controller-manager-57548d458d-46wh4" (UID: "b7bce599-dd9d-43c5-b5a9-53a081b6f183") : secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.189133 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.222224 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq"] Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.223377 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a2270f_58bd_486b_9be3_c85fee980070.slice/crio-3a9f6528e0c8c08266c7c4b021ab435535bdac79370a4a5f0c07a8a4a31ffc78 WatchSource:0}: Error finding container 3a9f6528e0c8c08266c7c4b021ab435535bdac79370a4a5f0c07a8a4a31ffc78: Status 404 returned error can't find the container with id 3a9f6528e0c8c08266c7c4b021ab435535bdac79370a4a5f0c07a8a4a31ffc78 Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.230980 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37983465_c081_4645_9a0b_47431d284dbe.slice/crio-62ec5163871d063c64aa2de5417fa066f925ff00363e713bf5261488e7484f52 WatchSource:0}: Error finding container 62ec5163871d063c64aa2de5417fa066f925ff00363e713bf5261488e7484f52: Status 404 returned error can't find the container with id 62ec5163871d063c64aa2de5417fa066f925ff00363e713bf5261488e7484f52 Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.235500 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ac4c28_9d59_4955_b5cf_ae45e97fdeed.slice/crio-e3e0a82b8c1fc3f721dd0ed19f2d24c8e1f0078a61a6d381567ad933a2963d0c WatchSource:0}: Error finding container e3e0a82b8c1fc3f721dd0ed19f2d24c8e1f0078a61a6d381567ad933a2963d0c: Status 404 returned error can't find the container with id e3e0a82b8c1fc3f721dd0ed19f2d24c8e1f0078a61a6d381567ad933a2963d0c Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.237226 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.245164 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.253504 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.257735 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6"] Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.271655 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a27983_d919_48fb_a227_f6a45efef985.slice/crio-64e142ee150f7866fb9b9ad6406f04281a452e565004fd9df7bfb88906498cf0 WatchSource:0}: Error finding container 64e142ee150f7866fb9b9ad6406f04281a452e565004fd9df7bfb88906498cf0: Status 404 returned error can't find the container 
with id 64e142ee150f7866fb9b9ad6406f04281a452e565004fd9df7bfb88906498cf0 Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.362973 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" event={"ID":"65a2270f-58bd-486b-9be3-c85fee980070","Type":"ContainerStarted","Data":"3a9f6528e0c8c08266c7c4b021ab435535bdac79370a4a5f0c07a8a4a31ffc78"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.376360 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" event={"ID":"2bb25289-630f-46c3-96f0-b5ea8177f5d8","Type":"ContainerStarted","Data":"a268997fc66df5798c2388d68cdbbe85a5c208ec98446a9c65f1ec775b31d128"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.386099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" event={"ID":"37983465-c081-4645-9a0b-47431d284dbe","Type":"ContainerStarted","Data":"62ec5163871d063c64aa2de5417fa066f925ff00363e713bf5261488e7484f52"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.387787 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" event={"ID":"7fb21378-fa3f-41a2-a6da-80831acec23c","Type":"ContainerStarted","Data":"9c9ff0a4f4e07e72d4c59c0ba91a812baa3c7a616502f8c11e6a369ee61be8d8"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.394262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" event={"ID":"f3a27983-d919-48fb-a227-f6a45efef985","Type":"ContainerStarted","Data":"64e142ee150f7866fb9b9ad6406f04281a452e565004fd9df7bfb88906498cf0"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.399827 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" event={"ID":"ae731a83-bab7-4843-b413-e8b03a3ca1c3","Type":"ContainerStarted","Data":"bd9a80de154ee93f38475e10ad33c2185f872e925d5f63af4bf0f51d3d2f0b36"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.430357 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" event={"ID":"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab","Type":"ContainerStarted","Data":"634d5cdd5d0cab805c440f6272d257954e225c9c450312d94181f2613cd43918"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.438684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" event={"ID":"92ac4c28-9d59-4955-b5cf-ae45e97fdeed","Type":"ContainerStarted","Data":"e3e0a82b8c1fc3f721dd0ed19f2d24c8e1f0078a61a6d381567ad933a2963d0c"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.441174 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" event={"ID":"5aa92828-abd4-4f89-9621-5e9830101fca","Type":"ContainerStarted","Data":"3191012116cf443bf0b560268788b9786bd9ba6ce4626efd98c8d3bb69427fce"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.443465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" 
event={"ID":"fc97bab3-20bf-4931-868d-a20ad433cc81","Type":"ContainerStarted","Data":"765ec8f425f1f3f8ea4403cd692341f20d668c0aac6e1a96e8c027ae78894085"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.445874 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" event={"ID":"79466fca-aa64-407e-9488-d89e43d4bed9","Type":"ContainerStarted","Data":"b8004e742e00ef98661c4033d3681f87e54181b4f94478eeb6ee96cbb2e27360"} Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.569336 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.569552 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.569605 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:27.569591221 +0000 UTC m=+1073.467184974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.608947 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.618194 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.622133 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.656535 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.671894 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv"] Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.689805 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ncww4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-cxrdm_openstack-operators(b352e856-6946-41ed-8d06-46b1ab00185e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.694979 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ncww4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-cxrdm_openstack-operators(b352e856-6946-41ed-8d06-46b1ab00185e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.694997 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhs86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-vfrzv_openstack-operators(b2d582c6-b444-4591-93c3-7681714732bc): ErrImagePull: pull 
QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.697573 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" podUID="b352e856-6946-41ed-8d06-46b1ab00185e" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.698231 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhs86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-vfrzv_openstack-operators(b2d582c6-b444-4591-93c3-7681714732bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.701662 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" podUID="b2d582c6-b444-4591-93c3-7681714732bc" Dec 04 10:00:25 crc kubenswrapper[4693]: W1204 10:00:25.706476 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de6adcf_e847_4a10_af8c_683f83c32551.slice/crio-bf6ed282beb86db8ddb6e2a467aec788f12fed7fd04d11dcc7bee6259f54be14 WatchSource:0}: Error finding container bf6ed282beb86db8ddb6e2a467aec788f12fed7fd04d11dcc7bee6259f54be14: Status 404 returned error can't find the container with id bf6ed282beb86db8ddb6e2a467aec788f12fed7fd04d11dcc7bee6259f54be14 Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.717733 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.777910 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89"] Dec 
04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.785670 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb"] Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.788819 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrb2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-4mp89_openstack-operators(1de6adcf-e847-4a10-af8c-683f83c32551): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.788773 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4mhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zmpkb_openstack-operators(a37bfc80-1ecc-4547-8fbe-be223b9a5cc2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.788961 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w8v5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4zbtc_openstack-operators(75a5a37b-eb32-4654-85f7-1c7b9de1c247): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.791124 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4mhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zmpkb_openstack-operators(a37bfc80-1ecc-4547-8fbe-be223b9a5cc2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.791211 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrb2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-4mp89_openstack-operators(1de6adcf-e847-4a10-af8c-683f83c32551): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.791292 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dzdrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xzz9b_openstack-operators(4ffbc9ab-625c-467a-b3cb-017b4167d8a1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.791892 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ 
--logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w8v5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4zbtc_openstack-operators(75a5a37b-eb32-4654-85f7-1c7b9de1c247): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.792435 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" podUID="4ffbc9ab-625c-467a-b3cb-017b4167d8a1" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.792491 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podUID="a37bfc80-1ecc-4547-8fbe-be223b9a5cc2" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.792518 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" podUID="1de6adcf-e847-4a10-af8c-683f83c32551" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.793188 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" podUID="75a5a37b-eb32-4654-85f7-1c7b9de1c247" Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.795648 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b"] Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.879854 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod 
\"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:25 crc kubenswrapper[4693]: I1204 10:00:25.880006 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.880364 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.880603 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:27.880577707 +0000 UTC m=+1073.778171450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.880699 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:25 crc kubenswrapper[4693]: E1204 10:00:25.880803 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:27.880773932 +0000 UTC m=+1073.778367875 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.469791 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podUID="a37bfc80-1ecc-4547-8fbe-be223b9a5cc2" Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.477437 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" podUID="b2d582c6-b444-4591-93c3-7681714732bc" Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.486399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" event={"ID":"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2","Type":"ContainerStarted","Data":"f8276ac7be9133c2f64b540782847a48c521d1218fa04048efa5cd52880c7d50"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.486452 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" event={"ID":"23baa4a2-ca26-41a5-968c-f642ca80d1fa","Type":"ContainerStarted","Data":"ef4164073c84769bd065bed0d9548b1546f40d0966cf796edef05dd2a64f1099"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.486469 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" event={"ID":"b2d582c6-b444-4591-93c3-7681714732bc","Type":"ContainerStarted","Data":"b3c8c4388e8ebd786b948153cf7b4e4dedbd2eae5a3b2fb8d7da954376f834be"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.486482 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" event={"ID":"9b4ce9a3-bc13-4726-af72-c0f4c619efec","Type":"ContainerStarted","Data":"9a290103595c3988bd1a252e143158f20bd4424f96be6886497a55e3625ca7c2"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.486495 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" event={"ID":"75a5a37b-eb32-4654-85f7-1c7b9de1c247","Type":"ContainerStarted","Data":"e531a974fe9aa7f5364116a03050f9f24c73d807b6c543f962217ed4c8d56483"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.491013 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" 
event={"ID":"4ffbc9ab-625c-467a-b3cb-017b4167d8a1","Type":"ContainerStarted","Data":"d4ed010a62e466814d57e4c3431b3a35b0315fdff9c928bca840382e7f509901"} Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.492275 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" podUID="4ffbc9ab-625c-467a-b3cb-017b4167d8a1" Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.492609 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" podUID="75a5a37b-eb32-4654-85f7-1c7b9de1c247" Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.502618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" event={"ID":"383c7650-d095-4996-88c6-06d999b1973b","Type":"ContainerStarted","Data":"0fc25a01467e7eb6378d647a8787962fac453391b7b0a269862b8971d354afba"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.512726 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" event={"ID":"b352e856-6946-41ed-8d06-46b1ab00185e","Type":"ContainerStarted","Data":"63686c00f5a0b59143bee4a976fc6e4dd832996efc4f949e69942554c1e0d0f6"} Dec 04 10:00:26 crc kubenswrapper[4693]: I1204 10:00:26.531134 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" event={"ID":"1de6adcf-e847-4a10-af8c-683f83c32551","Type":"ContainerStarted","Data":"bf6ed282beb86db8ddb6e2a467aec788f12fed7fd04d11dcc7bee6259f54be14"} Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.535969 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" podUID="1de6adcf-e847-4a10-af8c-683f83c32551" Dec 04 10:00:26 crc kubenswrapper[4693]: E1204 10:00:26.536468 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" 
podUID="b352e856-6946-41ed-8d06-46b1ab00185e" Dec 04 10:00:27 crc kubenswrapper[4693]: I1204 10:00:27.126848 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.127406 4693 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.127461 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert podName:b7bce599-dd9d-43c5-b5a9-53a081b6f183 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:31.127445211 +0000 UTC m=+1077.025038964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert") pod "infra-operator-controller-manager-57548d458d-46wh4" (UID: "b7bce599-dd9d-43c5-b5a9-53a081b6f183") : secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.564566 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" podUID="4ffbc9ab-625c-467a-b3cb-017b4167d8a1" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.564650 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" podUID="1de6adcf-e847-4a10-af8c-683f83c32551" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.565623 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" podUID="b352e856-6946-41ed-8d06-46b1ab00185e" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.565652 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" podUID="b2d582c6-b444-4591-93c3-7681714732bc" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.565699 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" podUID="75a5a37b-eb32-4654-85f7-1c7b9de1c247" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.565900 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podUID="a37bfc80-1ecc-4547-8fbe-be223b9a5cc2" Dec 04 10:00:27 crc kubenswrapper[4693]: I1204 10:00:27.640262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.641160 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.641223 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:31.641206001 +0000 UTC m=+1077.538799754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: I1204 10:00:27.948096 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:27 crc kubenswrapper[4693]: I1204 10:00:27.948199 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.948283 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.948389 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:31.948363523 +0000 UTC m=+1077.845957276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.948402 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:27 crc kubenswrapper[4693]: E1204 10:00:27.948482 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:31.948456045 +0000 UTC m=+1077.846049798 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:31 crc kubenswrapper[4693]: I1204 10:00:31.214958 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:31 crc kubenswrapper[4693]: E1204 10:00:31.215389 4693 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:31 crc kubenswrapper[4693]: E1204 10:00:31.216439 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert podName:b7bce599-dd9d-43c5-b5a9-53a081b6f183 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:39.21641388 +0000 UTC m=+1085.114007633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert") pod "infra-operator-controller-manager-57548d458d-46wh4" (UID: "b7bce599-dd9d-43c5-b5a9-53a081b6f183") : secret "infra-operator-webhook-server-cert" not found Dec 04 10:00:31 crc kubenswrapper[4693]: I1204 10:00:31.723275 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:31 crc kubenswrapper[4693]: E1204 10:00:31.723469 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:31 crc kubenswrapper[4693]: E1204 10:00:31.723555 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:39.72353351 +0000 UTC m=+1085.621127263 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:32 crc kubenswrapper[4693]: I1204 10:00:32.027480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:32 crc kubenswrapper[4693]: I1204 10:00:32.027534 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:32 crc kubenswrapper[4693]: E1204 10:00:32.027755 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:32 crc kubenswrapper[4693]: E1204 10:00:32.027817 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:40.027803064 +0000 UTC m=+1085.925396817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:32 crc kubenswrapper[4693]: E1204 10:00:32.027858 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:32 crc kubenswrapper[4693]: E1204 10:00:32.027877 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:40.027869886 +0000 UTC m=+1085.925463639 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:39 crc kubenswrapper[4693]: I1204 10:00:39.244483 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:39 crc kubenswrapper[4693]: I1204 10:00:39.250606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7bce599-dd9d-43c5-b5a9-53a081b6f183-cert\") pod \"infra-operator-controller-manager-57548d458d-46wh4\" (UID: \"b7bce599-dd9d-43c5-b5a9-53a081b6f183\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:39 crc kubenswrapper[4693]: I1204 10:00:39.441696 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wj2vf" Dec 04 10:00:39 crc kubenswrapper[4693]: I1204 10:00:39.450836 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:00:39 crc kubenswrapper[4693]: E1204 10:00:39.512159 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 04 10:00:39 crc kubenswrapper[4693]: E1204 10:00:39.512375 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tjw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-np6kl_openstack-operators(ae731a83-bab7-4843-b413-e8b03a3ca1c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:39 crc kubenswrapper[4693]: I1204 10:00:39.752536 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:39 crc kubenswrapper[4693]: E1204 10:00:39.752723 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:39 crc kubenswrapper[4693]: E1204 10:00:39.752782 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert podName:287b0c68-a203-4af6-b654-2eb97b004cdc nodeName:}" failed. No retries permitted until 2025-12-04 10:00:55.75276704 +0000 UTC m=+1101.650360793 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" (UID: "287b0c68-a203-4af6-b654-2eb97b004cdc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 10:00:40 crc kubenswrapper[4693]: I1204 10:00:40.056253 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:40 crc kubenswrapper[4693]: I1204 10:00:40.056644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.056492 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.056744 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:56.056721925 +0000 UTC m=+1101.954315678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "metrics-server-cert" not found Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.056771 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.056819 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs podName:9b4532d5-fce3-43a3-b72c-c0752eae7945 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:56.056803557 +0000 UTC m=+1101.954397310 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs") pod "openstack-operator-controller-manager-5bf9d46bf4-jn6kv" (UID: "9b4532d5-fce3-43a3-b72c-c0752eae7945") : secret "webhook-server-cert" not found Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.673870 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 04 10:00:40 crc kubenswrapper[4693]: E1204 10:00:40.674030 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-559jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-klr7c_openstack-operators(fc97bab3-20bf-4931-868d-a20ad433cc81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:41 crc kubenswrapper[4693]: E1204 10:00:41.456592 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 04 10:00:41 crc 
kubenswrapper[4693]: E1204 10:00:41.456780 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8mr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_openstack-operators(92ac4c28-9d59-4955-b5cf-ae45e97fdeed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:42 crc kubenswrapper[4693]: E1204 10:00:42.093789 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 04 10:00:42 crc kubenswrapper[4693]: E1204 10:00:42.094250 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dhnm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gtwgq_openstack-operators(65a2270f-58bd-486b-9be3-c85fee980070): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:45 crc kubenswrapper[4693]: I1204 10:00:45.768735 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-46wh4"] Dec 04 10:00:46 crc kubenswrapper[4693]: W1204 10:00:46.560774 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bce599_dd9d_43c5_b5a9_53a081b6f183.slice/crio-9411d2fe53b9f4dc34f8af2f813f76bee8f5cf49e0c5db5d2d24f9dc0ccc85fd WatchSource:0}: Error finding container 9411d2fe53b9f4dc34f8af2f813f76bee8f5cf49e0c5db5d2d24f9dc0ccc85fd: Status 404 returned error can't find the container with id 9411d2fe53b9f4dc34f8af2f813f76bee8f5cf49e0c5db5d2d24f9dc0ccc85fd Dec 04 10:00:46 crc kubenswrapper[4693]: I1204 10:00:46.732495 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" event={"ID":"b7bce599-dd9d-43c5-b5a9-53a081b6f183","Type":"ContainerStarted","Data":"9411d2fe53b9f4dc34f8af2f813f76bee8f5cf49e0c5db5d2d24f9dc0ccc85fd"} Dec 04 10:00:50 crc kubenswrapper[4693]: I1204 10:00:50.870839 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" 
event={"ID":"383c7650-d095-4996-88c6-06d999b1973b","Type":"ContainerStarted","Data":"fce956a00ba4b88a0b5ff0750005024fed3e75d095e2cece3909ec44254c0d0b"} Dec 04 10:00:50 crc kubenswrapper[4693]: I1204 10:00:50.902791 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" event={"ID":"23baa4a2-ca26-41a5-968c-f642ca80d1fa","Type":"ContainerStarted","Data":"e195e28c1fcdc453f74692230ac9b9f93a1d531af9256cfb54c5a9b9611534ac"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.939865 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" event={"ID":"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab","Type":"ContainerStarted","Data":"6f2a899287068ecaefd019e4c4477e67b97edbd1e639c1a4967af255107cf4cd"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.943487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" event={"ID":"2bb25289-630f-46c3-96f0-b5ea8177f5d8","Type":"ContainerStarted","Data":"4e672eb3a44a447dcb5ba4972f53186baeb4ecfb5a830e2094509d5c9db8dcb3"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.946978 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" event={"ID":"79466fca-aa64-407e-9488-d89e43d4bed9","Type":"ContainerStarted","Data":"7c931ba0b7df9de64cfbf082a48d5dd599a4ebea5d8a1ebadd52fb16de3ecd43"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.950170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" event={"ID":"5aa92828-abd4-4f89-9621-5e9830101fca","Type":"ContainerStarted","Data":"77b1c6d2b5c87bdd8dc6a1397bec557fc5999270accf5a7d5521ed14285dbfea"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.952660 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" event={"ID":"37983465-c081-4645-9a0b-47431d284dbe","Type":"ContainerStarted","Data":"b84e3b19fb3c0547c2ba0444efaaf4cd664db1290e2db8d29fad3b46de84e75e"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.954230 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" event={"ID":"7fb21378-fa3f-41a2-a6da-80831acec23c","Type":"ContainerStarted","Data":"c920f7aad9729e628967a3d708655b3f7f0246aa23b3933bee172e18426871f8"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.955190 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" event={"ID":"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2","Type":"ContainerStarted","Data":"29d061afc19c9d701c5bdfd824aeaec8611e5cc806bb9ceb1da23f557afb3d2c"} Dec 04 10:00:51 crc kubenswrapper[4693]: I1204 10:00:51.956502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" event={"ID":"9b4ce9a3-bc13-4726-af72-c0f4c619efec","Type":"ContainerStarted","Data":"8dff48d95d6655d85ce10eba8669331e9456d9c28f0474723fd2d3781c829f27"} Dec 04 10:00:52 crc kubenswrapper[4693]: I1204 10:00:52.977711 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" 
event={"ID":"b2d582c6-b444-4591-93c3-7681714732bc","Type":"ContainerStarted","Data":"7bb26e2cdf7d4b6dd0ccc013f2df04e1164ad28da41ed50b3c4abb35b597d7fb"} Dec 04 10:00:53 crc kubenswrapper[4693]: I1204 10:00:53.011081 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" event={"ID":"f3a27983-d919-48fb-a227-f6a45efef985","Type":"ContainerStarted","Data":"5e23c1286decfcffc551a77a88861793c9bf3a100862f19ec0028f89029c89ba"} Dec 04 10:00:53 crc kubenswrapper[4693]: I1204 10:00:53.016153 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" event={"ID":"1de6adcf-e847-4a10-af8c-683f83c32551","Type":"ContainerStarted","Data":"b1a98dcec01a4408869fec16b6a3f2206b53c8402d45b5bef7410032efe71040"} Dec 04 10:00:53 crc kubenswrapper[4693]: I1204 10:00:53.017033 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" event={"ID":"b352e856-6946-41ed-8d06-46b1ab00185e","Type":"ContainerStarted","Data":"2069fbb55e7e0ec2e44039ba1826148bb36c15131bf98f64918c4c902f80f6de"} Dec 04 10:00:54 crc kubenswrapper[4693]: I1204 10:00:54.027752 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" event={"ID":"75a5a37b-eb32-4654-85f7-1c7b9de1c247","Type":"ContainerStarted","Data":"f835909a8a197265542e0b5ce2b145ca1bf2765b1404f9a81062b156000d6b76"} Dec 04 10:00:54 crc kubenswrapper[4693]: I1204 10:00:54.029371 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" event={"ID":"4ffbc9ab-625c-467a-b3cb-017b4167d8a1","Type":"ContainerStarted","Data":"4b3a88503be70e15a6afce8f2d35b7bb14d1b7479ac9c34b8763e8b22b250359"} Dec 04 10:00:54 crc kubenswrapper[4693]: I1204 10:00:54.050460 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xzz9b" podStartSLOduration=5.182965023 podStartE2EDuration="30.050445361s" podCreationTimestamp="2025-12-04 10:00:24 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.791238536 +0000 UTC m=+1071.688832289" lastFinishedPulling="2025-12-04 10:00:50.658718874 +0000 UTC m=+1096.556312627" observedRunningTime="2025-12-04 10:00:54.047725558 +0000 UTC m=+1099.945319311" watchObservedRunningTime="2025-12-04 10:00:54.050445361 +0000 UTC m=+1099.948039114" Dec 04 10:00:55 crc kubenswrapper[4693]: I1204 10:00:55.781697 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:55 crc kubenswrapper[4693]: I1204 10:00:55.787526 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/287b0c68-a203-4af6-b654-2eb97b004cdc-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7\" (UID: \"287b0c68-a203-4af6-b654-2eb97b004cdc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.039579 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fjfsb" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.047783 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.094287 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.094342 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.100786 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-webhook-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.108402 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4532d5-fce3-43a3-b72c-c0752eae7945-metrics-certs\") pod \"openstack-operator-controller-manager-5bf9d46bf4-jn6kv\" (UID: \"9b4532d5-fce3-43a3-b72c-c0752eae7945\") " pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.303046 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-czv5q" Dec 04 10:00:56 crc kubenswrapper[4693]: I1204 10:00:56.310953 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:01:14 crc kubenswrapper[4693]: E1204 10:01:14.897971 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:14 crc kubenswrapper[4693]: E1204 10:01:14.898630 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-559jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-klr7c_openstack-operators(fc97bab3-20bf-4931-868d-a20ad433cc81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:14 crc kubenswrapper[4693]: E1204 10:01:14.899858 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" podUID="fc97bab3-20bf-4931-868d-a20ad433cc81" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.968697 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.968957 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kghfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-pzpj7_openstack-operators(383c7650-d095-4996-88c6-06d999b1973b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.970141 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" podUID="383c7650-d095-4996-88c6-06d999b1973b" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.985776 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.985793 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.986007 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tbhql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-qcc4g_openstack-operators(37983465-c081-4645-9a0b-47431d284dbe): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.986236 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bmtpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-gfvq4_openstack-operators(23baa4a2-ca26-41a5-968c-f642ca80d1fa): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.987284 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" podUID="37983465-c081-4645-9a0b-47431d284dbe" Dec 04 10:01:15 crc kubenswrapper[4693]: E1204 10:01:15.987392 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" podUID="23baa4a2-ca26-41a5-968c-f642ca80d1fa" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.012413 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.012640 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjr5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-tnzrv_openstack-operators(7fb21378-fa3f-41a2-a6da-80831acec23c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.016108 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" podUID="7fb21378-fa3f-41a2-a6da-80831acec23c" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.040816 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.041067 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4mhn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-zmpkb_openstack-operators(a37bfc80-1ecc-4547-8fbe-be223b9a5cc2): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.042394 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podUID="a37bfc80-1ecc-4547-8fbe-be223b9a5cc2" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.043840 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.044032 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-29brj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-vtgdw_openstack-operators(5aa92828-abd4-4f89-9621-5e9830101fca): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc 
kubenswrapper[4693]: E1204 10:01:16.045258 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" podUID="5aa92828-abd4-4f89-9621-5e9830101fca" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.054515 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.055018 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v82wk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-57548d458d-46wh4_openstack-operators(b7bce599-dd9d-43c5-b5a9-53a081b6f183): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.149175 4693 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.149663 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8mr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_openstack-operators(92ac4c28-9d59-4955-b5cf-ae45e97fdeed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.151214 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" podUID="92ac4c28-9d59-4955-b5cf-ae45e97fdeed" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.163288 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.163486 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dhnm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gtwgq_openstack-operators(65a2270f-58bd-486b-9be3-c85fee980070): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.164873 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" podUID="65a2270f-58bd-486b-9be3-c85fee980070" Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.196814 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.196845 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.200078 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.200589 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.255098 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" podUID="37983465-c081-4645-9a0b-47431d284dbe" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.350081 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" podUID="5aa92828-abd4-4f89-9621-5e9830101fca" Dec 04 10:01:16 crc kubenswrapper[4693]: E1204 10:01:16.350266 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podUID="a37bfc80-1ecc-4547-8fbe-be223b9a5cc2" Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.540930 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7"] Dec 04 10:01:16 crc kubenswrapper[4693]: I1204 10:01:16.576850 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv"] Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.202808 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" event={"ID":"7fb21378-fa3f-41a2-a6da-80831acec23c","Type":"ContainerStarted","Data":"48d7b19cc9260742ed98b3ea3db3e363b2ad5851832ba6655237a228401e5200"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.204635 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" event={"ID":"23baa4a2-ca26-41a5-968c-f642ca80d1fa","Type":"ContainerStarted","Data":"8cb3485afcb9217811331953fa351ed8b936f79c3695262144f7dedabefe6037"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.204940 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.206574 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.207483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" event={"ID":"9b4ce9a3-bc13-4726-af72-c0f4c619efec","Type":"ContainerStarted","Data":"568c7de936cb36ca7c216563378d1228ffaf1db2f8bd0e06e11ad127ecff3fbc"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.207663 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.209039 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.209423 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" event={"ID":"287b0c68-a203-4af6-b654-2eb97b004cdc","Type":"ContainerStarted","Data":"d1bdd6d2e9d1d1367ac167ad0dda65663958add6b1df189f295a1fd444b8ae73"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.214699 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" event={"ID":"fc97bab3-20bf-4931-868d-a20ad433cc81","Type":"ContainerStarted","Data":"bbfc971670f56db920dd89e943a9c6136c08edb5e49656ad4cf5850ff66c073a"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.216633 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" event={"ID":"383c7650-d095-4996-88c6-06d999b1973b","Type":"ContainerStarted","Data":"417a2bfc5b0ec88a6b377b5a11d106e93110580b5686ac5c7d90224b077d299b"} Dec 04 
10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.217544 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" event={"ID":"9b4532d5-fce3-43a3-b72c-c0752eae7945","Type":"ContainerStarted","Data":"36a0b8c0c511c9568e34c4367f39217755d914a8b9c6aed87b1396d6a0248b0c"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.219074 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" event={"ID":"79466fca-aa64-407e-9488-d89e43d4bed9","Type":"ContainerStarted","Data":"fbfdbe63cc3f465c4425ab6f04eba87024d44f6b53fb8b918b6a9438db701255"} Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.230975 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gfvq4" podStartSLOduration=37.81066295 podStartE2EDuration="54.230960282s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.683739674 +0000 UTC m=+1071.581333427" lastFinishedPulling="2025-12-04 10:00:42.104037006 +0000 UTC m=+1088.001630759" observedRunningTime="2025-12-04 10:01:17.224690023 +0000 UTC m=+1123.122283776" watchObservedRunningTime="2025-12-04 10:01:17.230960282 +0000 UTC m=+1123.128554035" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.303985 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzpj7" podStartSLOduration=37.436562186 podStartE2EDuration="54.303966512s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.741487169 +0000 UTC m=+1071.639080922" lastFinishedPulling="2025-12-04 10:00:42.608891495 +0000 UTC m=+1088.506485248" observedRunningTime="2025-12-04 10:01:17.251791617 +0000 UTC m=+1123.149385360" watchObservedRunningTime="2025-12-04 10:01:17.303966512 +0000 UTC m=+1123.201560265" Dec 04 10:01:17 crc kubenswrapper[4693]: I1204 10:01:17.351466 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zg6gn" podStartSLOduration=3.582837234 podStartE2EDuration="54.351445s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.683831267 +0000 UTC m=+1071.581425020" lastFinishedPulling="2025-12-04 10:01:16.452439033 +0000 UTC m=+1122.350032786" observedRunningTime="2025-12-04 10:01:17.344582464 +0000 UTC m=+1123.242176227" watchObservedRunningTime="2025-12-04 10:01:17.351445 +0000 UTC m=+1123.249038753" Dec 04 10:01:17 crc kubenswrapper[4693]: E1204 10:01:17.622461 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 04 10:01:17 crc kubenswrapper[4693]: E1204 10:01:17.622958 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9tjw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-np6kl_openstack-operators(ae731a83-bab7-4843-b413-e8b03a3ca1c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:17 crc kubenswrapper[4693]: E1204 10:01:17.624968 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" podUID="ae731a83-bab7-4843-b413-e8b03a3ca1c3" Dec 04 10:01:17 crc kubenswrapper[4693]: E1204 10:01:17.989808 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" podUID="b7bce599-dd9d-43c5-b5a9-53a081b6f183" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.231296 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" event={"ID":"75a5a37b-eb32-4654-85f7-1c7b9de1c247","Type":"ContainerStarted","Data":"4aa5f03cc8909e2485a5ec07937c8b13a63b2419b6d5b37fe09b4f169d6719a2"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.231398 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.233902 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" event={"ID":"fc97bab3-20bf-4931-868d-a20ad433cc81","Type":"ContainerStarted","Data":"cc0742b95df257ab2876d40b0519f27d83ba269d7e6af98a182edf48247ee6c5"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.234128 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.234608 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.236600 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" event={"ID":"b352e856-6946-41ed-8d06-46b1ab00185e","Type":"ContainerStarted","Data":"9789c388021ff301a49a810d585a13e375f2cca43dfb9becec2a692cff674799"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.237257 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.238559 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" event={"ID":"9b4532d5-fce3-43a3-b72c-c0752eae7945","Type":"ContainerStarted","Data":"6953feb499af08db26fb52701e3a48c05731e3d46ce99a0bb09ba6158e0b0dcf"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.239440 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.241279 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.242963 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" event={"ID":"5aa92828-abd4-4f89-9621-5e9830101fca","Type":"ContainerStarted","Data":"de11064f0894bbf4598409084acd7ca7ab72678c4cda11988f776f8c902dc3d6"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.251149 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4zbtc" podStartSLOduration=3.521568138 podStartE2EDuration="55.251107426s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.788876763 +0000 UTC m=+1071.686470516" lastFinishedPulling="2025-12-04 10:01:17.518416011 +0000 UTC m=+1123.416009804" observedRunningTime="2025-12-04 10:01:18.248438183 +0000 UTC m=+1124.146031936" watchObservedRunningTime="2025-12-04 10:01:18.251107426 +0000 UTC m=+1124.148701179" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.255874 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" event={"ID":"b7bce599-dd9d-43c5-b5a9-53a081b6f183","Type":"ContainerStarted","Data":"48bbceb3ad8d9c66b49b6fa27c94f172c127a10751aa4da24a373e592e0918b8"} Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.256031 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:01:18 crc kubenswrapper[4693]: E1204 10:01:18.256711 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" podUID="b7bce599-dd9d-43c5-b5a9-53a081b6f183" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.268650 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.271714 4693 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-vtgdw" podStartSLOduration=36.424957974 podStartE2EDuration="56.271697555s" podCreationTimestamp="2025-12-04 10:00:22 +0000 UTC" firstStartedPulling="2025-12-04 10:00:24.902865406 +0000 UTC m=+1070.800459169" lastFinishedPulling="2025-12-04 10:00:44.749604997 +0000 UTC m=+1090.647198750" observedRunningTime="2025-12-04 10:01:18.268900839 +0000 UTC m=+1124.166494592" watchObservedRunningTime="2025-12-04 10:01:18.271697555 +0000 UTC m=+1124.169291308" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.369775 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" podStartSLOduration=55.369758865 podStartE2EDuration="55.369758865s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:18.367487833 +0000 UTC m=+1124.265081586" watchObservedRunningTime="2025-12-04 10:01:18.369758865 +0000 UTC m=+1124.267352618" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.404057 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" podStartSLOduration=4.126878505 podStartE2EDuration="55.404038034s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.250575177 +0000 UTC m=+1071.148168920" lastFinishedPulling="2025-12-04 10:01:16.527734706 +0000 UTC m=+1122.425328449" observedRunningTime="2025-12-04 10:01:18.388733049 +0000 UTC m=+1124.286326812" watchObservedRunningTime="2025-12-04 10:01:18.404038034 +0000 UTC m=+1124.301631787" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.432651 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-cxrdm" podStartSLOduration=4.485544432 podStartE2EDuration="55.432637151s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.689581742 +0000 UTC m=+1071.587175485" lastFinishedPulling="2025-12-04 10:01:16.636674451 +0000 UTC m=+1122.534268204" observedRunningTime="2025-12-04 10:01:18.42891934 +0000 UTC m=+1124.326513093" watchObservedRunningTime="2025-12-04 10:01:18.432637151 +0000 UTC m=+1124.330230904" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.530203 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" podStartSLOduration=4.033376837 podStartE2EDuration="55.530185396s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.031156053 +0000 UTC m=+1070.928749806" lastFinishedPulling="2025-12-04 10:01:16.527964612 +0000 UTC m=+1122.425558365" observedRunningTime="2025-12-04 10:01:18.511779057 +0000 UTC m=+1124.409372810" watchObservedRunningTime="2025-12-04 10:01:18.530185396 +0000 UTC m=+1124.427779149" Dec 04 10:01:18 crc kubenswrapper[4693]: I1204 10:01:18.543433 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-tnzrv" podStartSLOduration=35.843645947 podStartE2EDuration="55.543412576s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 
10:00:25.050230059 +0000 UTC m=+1070.947823812" lastFinishedPulling="2025-12-04 10:00:44.749996688 +0000 UTC m=+1090.647590441" observedRunningTime="2025-12-04 10:01:18.536597591 +0000 UTC m=+1124.434191344" watchObservedRunningTime="2025-12-04 10:01:18.543412576 +0000 UTC m=+1124.441006329" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.273630 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" event={"ID":"b2d582c6-b444-4591-93c3-7681714732bc","Type":"ContainerStarted","Data":"9ada8d211d5d8e32daf7b50824c7b28886d958ffd8cca0ccb3598e6ddc8bf013"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.274845 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.277939 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.285506 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" event={"ID":"92ac4c28-9d59-4955-b5cf-ae45e97fdeed","Type":"ContainerStarted","Data":"66d999854fa94b3925cee743cf3072a27ba289551865b078f6e881de0cd04c04"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.285552 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" event={"ID":"92ac4c28-9d59-4955-b5cf-ae45e97fdeed","Type":"ContainerStarted","Data":"2b6fbf3284d21dcb4ad78aff272384c97165b2669066e36fca742f448b96d220"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.286143 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.302042 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-vfrzv" podStartSLOduration=4.484362559 podStartE2EDuration="56.302020634s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.694816164 +0000 UTC m=+1071.592409917" lastFinishedPulling="2025-12-04 10:01:17.512474249 +0000 UTC m=+1123.410067992" observedRunningTime="2025-12-04 10:01:19.301752577 +0000 UTC m=+1125.199346350" watchObservedRunningTime="2025-12-04 10:01:19.302020634 +0000 UTC m=+1125.199614387" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.315562 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" event={"ID":"65a2270f-58bd-486b-9be3-c85fee980070","Type":"ContainerStarted","Data":"fbd7e8aaf9ee1c00e7a7b74225156d5fcd46e644edaaa5692eb570d295e64ba2"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.315624 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" event={"ID":"65a2270f-58bd-486b-9be3-c85fee980070","Type":"ContainerStarted","Data":"70343972b94c512586f3901ad70400ce3021d90e7330238dafdb395d0e39469a"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.316153 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:01:19 
crc kubenswrapper[4693]: I1204 10:01:19.335317 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" event={"ID":"f3a27983-d919-48fb-a227-f6a45efef985","Type":"ContainerStarted","Data":"bac50a2c61d55547d108afb67fe772e19ebdd534dc2877f11be674e69a07d37e"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.339662 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.342897 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" event={"ID":"4ba18ef1-50c1-48d0-9d2e-3c83c65913ab","Type":"ContainerStarted","Data":"10a220741ae9c01abf06bafb5b32b4b9602f7087a1cb4bc8aff659b118363a5b"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.365662 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.365770 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.385060 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" podStartSLOduration=3.5404825090000003 podStartE2EDuration="56.385032056s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.241412629 +0000 UTC m=+1071.139006382" lastFinishedPulling="2025-12-04 10:01:18.085962176 +0000 UTC m=+1123.983555929" observedRunningTime="2025-12-04 10:01:19.374218293 +0000 UTC m=+1125.271812056" watchObservedRunningTime="2025-12-04 10:01:19.385032056 +0000 UTC m=+1125.282625809" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.390742 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.390787 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" event={"ID":"1de6adcf-e847-4a10-af8c-683f83c32551","Type":"ContainerStarted","Data":"285a4a702557eed8b55757e32bb3e6de3f72085a988d08fd1546fe71a387adf3"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.391149 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.448374 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.452262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" event={"ID":"2bb25289-630f-46c3-96f0-b5ea8177f5d8","Type":"ContainerStarted","Data":"26a244ce120ac020cc4d29675fd0b42e1ef6538bcf6c6c3ce4b047c99e06f227"} Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.455489 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:01:19 crc 
kubenswrapper[4693]: I1204 10:01:19.458251 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.458838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" event={"ID":"ae731a83-bab7-4843-b413-e8b03a3ca1c3","Type":"ContainerStarted","Data":"e999098318836fa2d4d99dc81c6060283847ad2073c42ffd6a50ae516be80691"} Dec 04 10:01:19 crc kubenswrapper[4693]: E1204 10:01:19.475042 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" podUID="b7bce599-dd9d-43c5-b5a9-53a081b6f183" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.481739 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-jgmqz" podStartSLOduration=3.882988006 podStartE2EDuration="56.481722359s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.044963326 +0000 UTC m=+1070.942557079" lastFinishedPulling="2025-12-04 10:01:17.643697679 +0000 UTC m=+1123.541291432" observedRunningTime="2025-12-04 10:01:19.448448166 +0000 UTC m=+1125.346041919" watchObservedRunningTime="2025-12-04 10:01:19.481722359 +0000 UTC m=+1125.379316112" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.503700 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-5j4cs" podStartSLOduration=5.174232439 podStartE2EDuration="57.503681274s" podCreationTimestamp="2025-12-04 10:00:22 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.274123115 +0000 UTC m=+1071.171716868" lastFinishedPulling="2025-12-04 10:01:17.60357195 +0000 UTC m=+1123.501165703" observedRunningTime="2025-12-04 10:01:19.48026116 +0000 UTC m=+1125.377854923" watchObservedRunningTime="2025-12-04 10:01:19.503681274 +0000 UTC m=+1125.401275027" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.552386 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" podStartSLOduration=4.158397472 podStartE2EDuration="56.552368285s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.229288241 +0000 UTC m=+1071.126881994" lastFinishedPulling="2025-12-04 10:01:17.623259054 +0000 UTC m=+1123.520852807" observedRunningTime="2025-12-04 10:01:19.549685272 +0000 UTC m=+1125.447279015" watchObservedRunningTime="2025-12-04 10:01:19.552368285 +0000 UTC m=+1125.449962038" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.614733 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4mp89" podStartSLOduration=4.029448506 podStartE2EDuration="56.614711667s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.788653757 +0000 UTC m=+1071.686247510" lastFinishedPulling="2025-12-04 10:01:18.373916918 +0000 UTC m=+1124.271510671" observedRunningTime="2025-12-04 10:01:19.58205702 +0000 UTC 
m=+1125.479650773" watchObservedRunningTime="2025-12-04 10:01:19.614711667 +0000 UTC m=+1125.512305420" Dec 04 10:01:19 crc kubenswrapper[4693]: I1204 10:01:19.652806 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q8crl" podStartSLOduration=3.479845804 podStartE2EDuration="56.652788429s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:24.912639001 +0000 UTC m=+1070.810232754" lastFinishedPulling="2025-12-04 10:01:18.085581616 +0000 UTC m=+1123.983175379" observedRunningTime="2025-12-04 10:01:19.615778606 +0000 UTC m=+1125.513372359" watchObservedRunningTime="2025-12-04 10:01:19.652788429 +0000 UTC m=+1125.550382182" Dec 04 10:01:20 crc kubenswrapper[4693]: I1204 10:01:20.525464 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" event={"ID":"ae731a83-bab7-4843-b413-e8b03a3ca1c3","Type":"ContainerStarted","Data":"a6e71f0328049dacce6735f3da1249ed7dd5a5cb98677c5dcaf39596707944e1"} Dec 04 10:01:20 crc kubenswrapper[4693]: I1204 10:01:20.540187 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" podStartSLOduration=3.73695262 podStartE2EDuration="57.540166012s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.222891267 +0000 UTC m=+1071.120485020" lastFinishedPulling="2025-12-04 10:01:19.026104669 +0000 UTC m=+1124.923698412" observedRunningTime="2025-12-04 10:01:20.539612786 +0000 UTC m=+1126.437206539" watchObservedRunningTime="2025-12-04 10:01:20.540166012 +0000 UTC m=+1126.437759765" Dec 04 10:01:21 crc kubenswrapper[4693]: I1204 10:01:21.563247 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.552475 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.553025 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-4gqg9" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.590830 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" event={"ID":"287b0c68-a203-4af6-b654-2eb97b004cdc","Type":"ContainerStarted","Data":"806e319e98685b1a9aa2d2a0e0e7359d871092819edc1de03367f8313b6a41b6"} Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.590901 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" event={"ID":"287b0c68-a203-4af6-b654-2eb97b004cdc","Type":"ContainerStarted","Data":"0a83f20d0ce1d46434191e3b6be993f4afd1fad5b9f56bcfe214c5f84672e8bc"} Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.590925 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.626305 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" podStartSLOduration=54.757276139 podStartE2EDuration="1m0.626288229s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:01:16.590025865 +0000 UTC m=+1122.487619618" lastFinishedPulling="2025-12-04 10:01:22.459037965 +0000 UTC m=+1128.356631708" observedRunningTime="2025-12-04 10:01:23.621474818 +0000 UTC m=+1129.519068571" watchObservedRunningTime="2025-12-04 10:01:23.626288229 +0000 UTC m=+1129.523881982" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.842314 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-w7zh6" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.921402 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.923405 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" Dec 04 10:01:23 crc kubenswrapper[4693]: I1204 10:01:23.947422 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gtwgq" Dec 04 10:01:24 crc kubenswrapper[4693]: I1204 10:01:24.191288 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-klr7c" Dec 04 10:01:24 crc kubenswrapper[4693]: I1204 10:01:24.318109 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:01:24 crc kubenswrapper[4693]: I1204 10:01:24.321044 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" Dec 04 10:01:24 crc kubenswrapper[4693]: I1204 10:01:24.600300 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" event={"ID":"37983465-c081-4645-9a0b-47431d284dbe","Type":"ContainerStarted","Data":"caa78fadfc9c6571737d085967b1972fd80d77aa04fc6831b6bb15d79450eb38"} Dec 04 10:01:24 crc kubenswrapper[4693]: I1204 10:01:24.618553 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-qcc4g" podStartSLOduration=44.749148576 podStartE2EDuration="1m1.618534386s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.234684647 +0000 UTC m=+1071.132278400" lastFinishedPulling="2025-12-04 10:00:42.104070457 +0000 UTC m=+1088.001664210" observedRunningTime="2025-12-04 10:01:24.614967849 +0000 UTC m=+1130.512561602" watchObservedRunningTime="2025-12-04 10:01:24.618534386 +0000 UTC m=+1130.516128139" Dec 04 10:01:25 crc kubenswrapper[4693]: I1204 10:01:25.611160 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" event={"ID":"a37bfc80-1ecc-4547-8fbe-be223b9a5cc2","Type":"ContainerStarted","Data":"8794dd04bf972df0e9022e9eae11fa0cc2896262c46c477bf9ded1f5a78075d2"} Dec 04 10:01:25 crc kubenswrapper[4693]: I1204 10:01:25.631674 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-zmpkb" podStartSLOduration=41.946316167 podStartE2EDuration="1m2.631656619s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:25.788453571 +0000 UTC m=+1071.686047324" lastFinishedPulling="2025-12-04 10:00:46.473794023 +0000 UTC m=+1092.371387776" observedRunningTime="2025-12-04 10:01:25.629897181 +0000 UTC m=+1131.527490934" watchObservedRunningTime="2025-12-04 10:01:25.631656619 +0000 UTC m=+1131.529250372" Dec 04 10:01:26 crc kubenswrapper[4693]: I1204 10:01:26.316620 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5bf9d46bf4-jn6kv" Dec 04 10:01:33 crc kubenswrapper[4693]: I1204 10:01:33.789594 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-np6kl" Dec 04 10:01:34 crc kubenswrapper[4693]: I1204 10:01:34.675465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" event={"ID":"b7bce599-dd9d-43c5-b5a9-53a081b6f183","Type":"ContainerStarted","Data":"e195b59134c427ec9b94dc112130c9f9611ef8e12d98923c98357e8c12926f69"} Dec 04 10:01:35 crc kubenswrapper[4693]: I1204 10:01:35.696568 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:01:35 crc kubenswrapper[4693]: I1204 10:01:35.714316 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" podStartSLOduration=25.181542286 podStartE2EDuration="1m12.714291989s" podCreationTimestamp="2025-12-04 10:00:23 +0000 UTC" firstStartedPulling="2025-12-04 10:00:46.592614563 +0000 UTC m=+1092.490208316" lastFinishedPulling="2025-12-04 10:01:34.125364266 +0000 UTC m=+1140.022958019" observedRunningTime="2025-12-04 10:01:35.709699294 +0000 UTC m=+1141.607293087" watchObservedRunningTime="2025-12-04 10:01:35.714291989 +0000 UTC m=+1141.611885752" Dec 04 10:01:36 crc kubenswrapper[4693]: I1204 10:01:36.054717 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7" Dec 04 10:01:39 crc kubenswrapper[4693]: I1204 10:01:39.458267 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-46wh4" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.688514 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.696965 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.706999 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.711120 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kdmsk" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.769152 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.780468 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.780534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx77\" (UniqueName: \"kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.789472 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.791038 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.795344 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.799919 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.882379 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.882583 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.882632 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpx77\" (UniqueName: \"kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.882686 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.882775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmc6g\" (UniqueName: \"kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.883506 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.905621 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpx77\" (UniqueName: \"kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77\") pod \"dnsmasq-dns-675f4bcbfc-98nc2\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.986443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmc6g\" (UniqueName: \"kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.986532 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.986620 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.988064 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:54 crc kubenswrapper[4693]: I1204 10:01:54.989011 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:55 crc kubenswrapper[4693]: I1204 10:01:55.025522 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmc6g\" (UniqueName: \"kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g\") pod \"dnsmasq-dns-78dd6ddcc-k7gd8\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:55 crc kubenswrapper[4693]: I1204 10:01:55.052849 
4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:01:55 crc kubenswrapper[4693]: I1204 10:01:55.110961 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:01:55 crc kubenswrapper[4693]: I1204 10:01:55.908427 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:01:56 crc kubenswrapper[4693]: I1204 10:01:56.006483 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:01:56 crc kubenswrapper[4693]: W1204 10:01:56.009153 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded22773d_6d16_48d9_87a7_b5b7aac712fd.slice/crio-61d813c312f8142b790c59cc62e85601655cf976b642f8532d6369cabbe70cc3 WatchSource:0}: Error finding container 61d813c312f8142b790c59cc62e85601655cf976b642f8532d6369cabbe70cc3: Status 404 returned error can't find the container with id 61d813c312f8142b790c59cc62e85601655cf976b642f8532d6369cabbe70cc3 Dec 04 10:01:56 crc kubenswrapper[4693]: I1204 10:01:56.848092 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" event={"ID":"ed22773d-6d16-48d9-87a7-b5b7aac712fd","Type":"ContainerStarted","Data":"61d813c312f8142b790c59cc62e85601655cf976b642f8532d6369cabbe70cc3"} Dec 04 10:01:56 crc kubenswrapper[4693]: I1204 10:01:56.849488 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" event={"ID":"629d83c6-3184-44ea-a2c5-5335c4acc9a3","Type":"ContainerStarted","Data":"1f98e13bafd57b0c49b159a62adfc590cb5481848cc7d77b6d9224ffd643177b"} Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.118735 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.149604 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.151591 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.163913 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.340277 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.340373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9zl\" (UniqueName: \"kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.340421 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.441901 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9zl\" (UniqueName: \"kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.442012 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.442107 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.442970 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.443114 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.453910 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.480502 
4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9zl\" (UniqueName: \"kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl\") pod \"dnsmasq-dns-666b6646f7-jl2ft\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.482917 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.489240 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.507411 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.644764 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.644842 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.644908 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlds9\" (UniqueName: \"kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.746214 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlds9\" (UniqueName: \"kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.746663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.746779 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.747704 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.748260 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.765307 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlds9\" (UniqueName: \"kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9\") pod \"dnsmasq-dns-57d769cc4f-9js7w\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.773007 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:01:57 crc kubenswrapper[4693]: I1204 10:01:57.827630 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.350587 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.355471 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.359179 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.359264 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.359589 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.360048 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.360270 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jzl54" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.360538 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.361101 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.360965 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.379604 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:01:58 crc kubenswrapper[4693]: W1204 10:01:58.405258 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e20e730_0b42_4aad_bcb0_6f0b77c04c3a.slice/crio-c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab WatchSource:0}: Error finding container c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab: Status 404 returned error can't find the container with id c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab Dec 04 10:01:58 crc 
kubenswrapper[4693]: I1204 10:01:58.504029 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504107 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504205 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504324 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504453 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504481 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504585 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504667 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6pn\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504704 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.504755 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.551277 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:01:58 crc kubenswrapper[4693]: W1204 10:01:58.567374 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ecabc57_5f50_49ef_8d1b_c8daa268642f.slice/crio-44dddc0587ef9ccc05e7966f7571bc7e084cf397cfab99e9458e591799c0f65b WatchSource:0}: Error finding container 44dddc0587ef9ccc05e7966f7571bc7e084cf397cfab99e9458e591799c0f65b: Status 404 returned error can't find the container with id 44dddc0587ef9ccc05e7966f7571bc7e084cf397cfab99e9458e591799c0f65b Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.607280 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610529 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6pn\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610576 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610597 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610613 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610642 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " 
pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610706 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610768 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610786 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.610810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.608249 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.614268 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.615258 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.618051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.618938 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.622351 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.622649 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.623290 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.624730 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.633740 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.635535 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.642732 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.646971 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.647605 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.647848 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.647959 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r5rkb" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.648098 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.648090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6pn\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.648373 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.648502 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.648622 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.655065 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.696826 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.814824 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.814903 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.814980 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khw8b\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815004 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815029 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815064 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815087 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815109 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815133 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815154 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.815181 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.912912 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" event={"ID":"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a","Type":"ContainerStarted","Data":"c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab"} Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.914598 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" event={"ID":"0ecabc57-5f50-49ef-8d1b-c8daa268642f","Type":"ContainerStarted","Data":"44dddc0587ef9ccc05e7966f7571bc7e084cf397cfab99e9458e591799c0f65b"} Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916584 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916632 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khw8b\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916866 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916895 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916952 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916971 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.916998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.918491 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.919070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.922312 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.924507 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") device 
mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.941490 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.941732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.942375 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.945411 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.950177 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:58 crc kubenswrapper[4693]: I1204 10:01:58.958822 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:59 crc kubenswrapper[4693]: I1204 10:01:59.012093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:59 crc kubenswrapper[4693]: I1204 10:01:59.021471 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khw8b\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:01:59 crc kubenswrapper[4693]: I1204 10:01:59.310131 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.140070 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.197478 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.345909 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.347079 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.356201 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.356413 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.356446 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ljlxx" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.360660 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.369038 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.369847 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473229 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-kolla-config\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473282 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473308 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-default\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473418 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt65\" (UniqueName: \"kubernetes.io/projected/e55c8437-1394-45c9-b135-2dbe68895d38-kube-api-access-8bt65\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473490 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.473528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.574482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt65\" (UniqueName: \"kubernetes.io/projected/e55c8437-1394-45c9-b135-2dbe68895d38-kube-api-access-8bt65\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.576041 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577030 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577073 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577103 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577192 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-kolla-config\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 
04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577219 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577269 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-default\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.577946 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.578294 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-default\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.578935 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-kolla-config\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.579184 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e55c8437-1394-45c9-b135-2dbe68895d38-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.583611 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55c8437-1394-45c9-b135-2dbe68895d38-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.592909 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.597882 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt65\" (UniqueName: \"kubernetes.io/projected/e55c8437-1394-45c9-b135-2dbe68895d38-kube-api-access-8bt65\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.599045 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e55c8437-1394-45c9-b135-2dbe68895d38-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.601858 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e55c8437-1394-45c9-b135-2dbe68895d38\") " pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.666263 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.970056 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerStarted","Data":"f555143c410f65fd65e90fe83fee572868963d76e090a838126f879bd1cbe4b9"} Dec 04 10:02:00 crc kubenswrapper[4693]: I1204 10:02:00.975304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerStarted","Data":"b9dbe041a9a892af100aa80824488278db9f03cd649ac04fb30c2c5009097bc4"} Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.135182 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 10:02:01 crc kubenswrapper[4693]: W1204 10:02:01.149546 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode55c8437_1394_45c9_b135_2dbe68895d38.slice/crio-2ff912605622e3af374b7c4c553bb94e03b5842013eeb28e67944296a53d2e56 WatchSource:0}: Error finding container 2ff912605622e3af374b7c4c553bb94e03b5842013eeb28e67944296a53d2e56: Status 404 returned error can't find the container with id 2ff912605622e3af374b7c4c553bb94e03b5842013eeb28e67944296a53d2e56 Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.455006 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.456300 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.463608 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cnpvw" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.463846 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.464186 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.463743 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.470581 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.607186 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.607670 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.607693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.607708 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.607736 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgst\" (UniqueName: \"kubernetes.io/projected/b85d3f3e-5811-4829-8b36-96ecb7f22492-kube-api-access-5pgst\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.610498 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.610965 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.611130 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.708238 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.709589 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.712862 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.712910 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.712970 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713017 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713038 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713061 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713077 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713104 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgst\" (UniqueName: \"kubernetes.io/projected/b85d3f3e-5811-4829-8b36-96ecb7f22492-kube-api-access-5pgst\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713543 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.713955 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.714783 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.715156 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.716265 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.716557 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.717097 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b85d3f3e-5811-4829-8b36-96ecb7f22492-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.720963 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nvmk4" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.727059 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.736537 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgst\" (UniqueName: \"kubernetes.io/projected/b85d3f3e-5811-4829-8b36-96ecb7f22492-kube-api-access-5pgst\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.750030 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.750350 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b85d3f3e-5811-4829-8b36-96ecb7f22492-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.782179 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b85d3f3e-5811-4829-8b36-96ecb7f22492\") " pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.814548 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-kolla-config\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.814613 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-config-data\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.814631 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjgsp\" (UniqueName: \"kubernetes.io/projected/f304446b-e129-40a5-bc56-a79d0b973f0a-kube-api-access-gjgsp\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.814690 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.814713 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.916173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.916257 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.916404 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-kolla-config\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.916456 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-config-data\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.916482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjgsp\" (UniqueName: \"kubernetes.io/projected/f304446b-e129-40a5-bc56-a79d0b973f0a-kube-api-access-gjgsp\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.917724 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-config-data\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.918106 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f304446b-e129-40a5-bc56-a79d0b973f0a-kolla-config\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.927211 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.927285 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f304446b-e129-40a5-bc56-a79d0b973f0a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.944984 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjgsp\" (UniqueName: \"kubernetes.io/projected/f304446b-e129-40a5-bc56-a79d0b973f0a-kube-api-access-gjgsp\") pod \"memcached-0\" (UID: \"f304446b-e129-40a5-bc56-a79d0b973f0a\") " pod="openstack/memcached-0" Dec 04 10:02:01 crc kubenswrapper[4693]: I1204 10:02:01.992226 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e55c8437-1394-45c9-b135-2dbe68895d38","Type":"ContainerStarted","Data":"2ff912605622e3af374b7c4c553bb94e03b5842013eeb28e67944296a53d2e56"} Dec 04 10:02:02 crc kubenswrapper[4693]: I1204 10:02:02.076294 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 10:02:02 crc kubenswrapper[4693]: I1204 10:02:02.101099 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 10:02:02 crc kubenswrapper[4693]: I1204 10:02:02.419928 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 10:02:02 crc kubenswrapper[4693]: I1204 10:02:02.753068 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 10:02:02 crc kubenswrapper[4693]: W1204 10:02:02.773706 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb85d3f3e_5811_4829_8b36_96ecb7f22492.slice/crio-ef6d34652daf286f0e70ed3c7109f3fcb5b6bc0753328f56011023c19e23ae6d WatchSource:0}: Error finding container ef6d34652daf286f0e70ed3c7109f3fcb5b6bc0753328f56011023c19e23ae6d: Status 404 returned error can't find the container with id ef6d34652daf286f0e70ed3c7109f3fcb5b6bc0753328f56011023c19e23ae6d Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.006484 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f304446b-e129-40a5-bc56-a79d0b973f0a","Type":"ContainerStarted","Data":"4e4010c124f0da347c02a44ff73487a2f02a0546934de4dd469ec7fa6e067a81"} Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.013746 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b85d3f3e-5811-4829-8b36-96ecb7f22492","Type":"ContainerStarted","Data":"ef6d34652daf286f0e70ed3c7109f3fcb5b6bc0753328f56011023c19e23ae6d"} Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.889121 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.890465 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.897826 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:03 crc kubenswrapper[4693]: I1204 10:02:03.937005 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-s2k8w" Dec 04 10:02:04 crc kubenswrapper[4693]: I1204 10:02:04.073947 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvpw\" (UniqueName: \"kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw\") pod \"kube-state-metrics-0\" (UID: \"2b831c57-110e-406f-b9a9-3c619add6639\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:04 crc kubenswrapper[4693]: I1204 10:02:04.175784 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvpw\" (UniqueName: \"kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw\") pod \"kube-state-metrics-0\" (UID: \"2b831c57-110e-406f-b9a9-3c619add6639\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:04 crc kubenswrapper[4693]: I1204 10:02:04.215044 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvpw\" (UniqueName: \"kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw\") pod \"kube-state-metrics-0\" (UID: \"2b831c57-110e-406f-b9a9-3c619add6639\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:04 crc kubenswrapper[4693]: I1204 10:02:04.249649 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:04 crc kubenswrapper[4693]: I1204 10:02:04.794285 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:04 crc kubenswrapper[4693]: W1204 10:02:04.801765 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b831c57_110e_406f_b9a9_3c619add6639.slice/crio-1a98bd828ff539b9c30bfede6829c87d64dc982eb84378c6b50dd439c9c5a330 WatchSource:0}: Error finding container 1a98bd828ff539b9c30bfede6829c87d64dc982eb84378c6b50dd439c9c5a330: Status 404 returned error can't find the container with id 1a98bd828ff539b9c30bfede6829c87d64dc982eb84378c6b50dd439c9c5a330 Dec 04 10:02:05 crc kubenswrapper[4693]: I1204 10:02:05.033798 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b831c57-110e-406f-b9a9-3c619add6639","Type":"ContainerStarted","Data":"1a98bd828ff539b9c30bfede6829c87d64dc982eb84378c6b50dd439c9c5a330"} Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.939538 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zczb4"] Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.941626 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zczb4" Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.949157 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-sg2sq" Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.949674 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.949890 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.955408 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7nh5c"] Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.973010 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zczb4"] Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.974411 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:07 crc kubenswrapper[4693]: I1204 10:02:07.989485 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7nh5c"] Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067208 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkczg\" (UniqueName: \"kubernetes.io/projected/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-kube-api-access-kkczg\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-ovn-controller-tls-certs\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067276 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-scripts\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067308 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-lib\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067325 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-combined-ca-bundle\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067483 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067515 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-run\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.067534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-scripts\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.069034 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-log-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.069106 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-log\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.069154 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46msc\" (UniqueName: \"kubernetes.io/projected/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-kube-api-access-46msc\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.069214 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.069245 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-etc-ovs\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.170929 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-ovn-controller-tls-certs\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171091 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-scripts\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171144 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-lib\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171162 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-combined-ca-bundle\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-run\") pod \"ovn-controller-ovs-7nh5c\" (UID: 
\"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171202 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-scripts\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171215 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171246 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-log-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171271 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-log\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171320 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46msc\" (UniqueName: \"kubernetes.io/projected/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-kube-api-access-46msc\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171366 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171386 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-etc-ovs\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.171422 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkczg\" (UniqueName: \"kubernetes.io/projected/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-kube-api-access-kkczg\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.172239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.172364 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-run\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.172372 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-lib\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.172427 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-etc-ovs\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.174223 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-scripts\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.176633 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-scripts\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.178427 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-run\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.178449 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-var-log\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.179019 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-var-log-ovn\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.187300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-combined-ca-bundle\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.188548 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-ovn-controller-tls-certs\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.196045 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46msc\" (UniqueName: \"kubernetes.io/projected/cd544a1e-c7e1-4f04-90f5-d9cf152c4f12-kube-api-access-46msc\") pod \"ovn-controller-ovs-7nh5c\" (UID: \"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12\") " pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.200633 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkczg\" (UniqueName: \"kubernetes.io/projected/ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a-kube-api-access-kkczg\") pod \"ovn-controller-zczb4\" (UID: \"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a\") " pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.310027 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zczb4" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.332765 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.412018 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.413426 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.419190 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.422278 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.422529 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.422684 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.425205 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-65c67" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.425465 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.580909 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbqx\" (UniqueName: \"kubernetes.io/projected/14639c36-341c-4f90-980b-b9fffce3c8f8-kube-api-access-9bbqx\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.580958 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581016 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" 
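
The reconciler_common.go and operation_generator.go entries in this stretch of the log record the kubelet volume manager walking each pod's volumes through three phases: "VerifyControllerAttachedVolume started", "MountVolume started", and "MountVolume.SetUp succeeded" (with "MountVolume.MountDevice succeeded" appearing additionally for the local-volume PVs, e.g. device mount path "/mnt/openstack/pv12"). A quick way to confirm that every volume that started mounting also reached SetUp is to summarize these messages per pod. The sketch below is illustrative only: it assumes journal text in the format shown here is piped on stdin (for example via journalctl -u kubelet, if that is the unit name on this node) and relies solely on the message strings quoted in the surrounding lines.

#!/usr/bin/env python3
# Sketch: summarize kubelet volume-mount progress per pod from journal text on stdin.
# Assumes lines formatted like the surrounding log; adjust the patterns if the format differs.
import re
import sys
from collections import defaultdict

# The volume's display name is the first \"...\" after "for volume";
# the pod appears as the structured field pod="namespace/name" near the end of the line.
VOLUME = re.compile(r'for volume \\?"(?P<vol>[^"\\]+)\\?"')
POD = re.compile(r'pod="(?P<pod>[^"]+)"')

PHASES = {
    "VerifyControllerAttachedVolume started": "attach-verified",
    "MountVolume started": "mount-started",
    "MountVolume.SetUp succeeded": "setup-ok",
}

state = defaultdict(lambda: defaultdict(set))  # pod -> phase -> set of volume names

for line in sys.stdin:
    for needle, phase in PHASES.items():
        if needle in line:
            vol = VOLUME.search(line)
            pod = POD.search(line)
            if vol and pod:
                state[pod.group("pod")][phase].add(vol.group("vol"))
            break

for pod, by_phase in sorted(state.items()):
    pending = by_phase["mount-started"] - by_phase["setup-ok"]
    print(f'{pod}: {len(by_phase["attach-verified"])} attach-verified, '
          f'{len(by_phase["setup-ok"])} set up, pending: {sorted(pending) or "none"}')

Run over this part of the log, it should report each pod seen so far (openstack-cell1-galera-0, memcached-0, kube-state-metrics-0, the two ovn-controller pods, and the ovsdbserver pods) with an empty pending set, since every "MountVolume started" message in this section is eventually followed by a matching "MountVolume.SetUp succeeded".
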
Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581033 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581054 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581069 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581187 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.581964 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbqx\" (UniqueName: \"kubernetes.io/projected/14639c36-341c-4f90-980b-b9fffce3c8f8-kube-api-access-9bbqx\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683226 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683242 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683257 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683272 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683289 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.683319 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.684406 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.684676 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.701217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.702317 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14639c36-341c-4f90-980b-b9fffce3c8f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.704861 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14639c36-341c-4f90-980b-b9fffce3c8f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.705054 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.724400 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9bbqx\" (UniqueName: \"kubernetes.io/projected/14639c36-341c-4f90-980b-b9fffce3c8f8-kube-api-access-9bbqx\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.724613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14639c36-341c-4f90-980b-b9fffce3c8f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:08 crc kubenswrapper[4693]: I1204 10:02:08.740306 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14639c36-341c-4f90-980b-b9fffce3c8f8\") " pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:09 crc kubenswrapper[4693]: I1204 10:02:09.034412 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.872565 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.874582 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.877721 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.878009 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8kpfw" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.878418 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.878844 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 10:02:11 crc kubenswrapper[4693]: I1204 10:02:11.884173 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.068812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.068917 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.068968 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.069005 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.069120 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.069286 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9999b422-d127-4990-8091-9446e589839a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.069319 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46k2r\" (UniqueName: \"kubernetes.io/projected/9999b422-d127-4990-8091-9446e589839a-kube-api-access-46k2r\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.069377 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.171889 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.171987 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172040 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172079 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172132 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46k2r\" (UniqueName: \"kubernetes.io/projected/9999b422-d127-4990-8091-9446e589839a-kube-api-access-46k2r\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172154 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9999b422-d127-4990-8091-9446e589839a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172180 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.172325 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.173108 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.173585 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9999b422-d127-4990-8091-9446e589839a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.173955 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9999b422-d127-4990-8091-9446e589839a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.179011 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.179039 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.179028 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9999b422-d127-4990-8091-9446e589839a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.191580 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46k2r\" (UniqueName: \"kubernetes.io/projected/9999b422-d127-4990-8091-9446e589839a-kube-api-access-46k2r\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.193761 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9999b422-d127-4990-8091-9446e589839a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:12 crc kubenswrapper[4693]: I1204 10:02:12.207128 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 10:02:37 crc kubenswrapper[4693]: E1204 10:02:37.422634 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 10:02:37 crc kubenswrapper[4693]: E1204 10:02:37.423370 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khw8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0f073022-a55b-4a76-8fbd-92df61f2d38b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:37 crc kubenswrapper[4693]: E1204 10:02:37.424631 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" Dec 04 10:02:38 crc kubenswrapper[4693]: E1204 10:02:38.317908 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.501188 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.502263 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 
's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p6pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(2d1a11f6-b003-41f8-a2f1-010d7dae29d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.509576 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.510571 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.510716 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bt65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(e55c8437-1394-45c9-b135-2dbe68895d38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:39 crc kubenswrapper[4693]: E1204 10:02:39.511856 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="e55c8437-1394-45c9-b135-2dbe68895d38" Dec 04 10:02:40 crc kubenswrapper[4693]: E1204 10:02:40.328157 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="e55c8437-1394-45c9-b135-2dbe68895d38" Dec 04 10:02:40 crc kubenswrapper[4693]: E1204 10:02:40.328415 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.664975 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 
10:02:45.665567 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmc6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-k7gd8_openstack(ed22773d-6d16-48d9-87a7-b5b7aac712fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.666754 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" podUID="ed22773d-6d16-48d9-87a7-b5b7aac712fd" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.700038 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.700200 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zpx77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-98nc2_openstack(629d83c6-3184-44ea-a2c5-5335c4acc9a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.701486 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" podUID="629d83c6-3184-44ea-a2c5-5335c4acc9a3" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.712298 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.712474 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlds9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-9js7w_openstack(0ecabc57-5f50-49ef-8d1b-c8daa268642f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.713636 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.746933 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.747124 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vk9zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-jl2ft_openstack(3e20e730-0b42-4aad-bcb0-6f0b77c04c3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:02:45 crc kubenswrapper[4693]: E1204 10:02:45.751024 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" Dec 04 10:02:46 crc kubenswrapper[4693]: I1204 10:02:46.214202 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7nh5c"] Dec 04 10:02:46 crc kubenswrapper[4693]: I1204 10:02:46.248859 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zczb4"] Dec 04 10:02:46 crc kubenswrapper[4693]: I1204 10:02:46.296194 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 10:02:46 crc kubenswrapper[4693]: E1204 10:02:46.369285 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" Dec 04 10:02:46 crc kubenswrapper[4693]: E1204 10:02:46.369691 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" Dec 04 10:02:46 crc kubenswrapper[4693]: I1204 
10:02:46.401488 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 10:02:47 crc kubenswrapper[4693]: W1204 10:02:47.080799 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14639c36_341c_4f90_980b_b9fffce3c8f8.slice/crio-4f653d09a2f4d94c47b29f6a30335abbfca162659825f885e0fbeca3d79994a4 WatchSource:0}: Error finding container 4f653d09a2f4d94c47b29f6a30335abbfca162659825f885e0fbeca3d79994a4: Status 404 returned error can't find the container with id 4f653d09a2f4d94c47b29f6a30335abbfca162659825f885e0fbeca3d79994a4 Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.147567 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.158788 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.207691 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpx77\" (UniqueName: \"kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77\") pod \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.208305 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmc6g\" (UniqueName: \"kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g\") pod \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.208378 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc\") pod \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.208425 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config\") pod \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\" (UID: \"ed22773d-6d16-48d9-87a7-b5b7aac712fd\") " Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.208460 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config\") pod \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\" (UID: \"629d83c6-3184-44ea-a2c5-5335c4acc9a3\") " Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.209247 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config" (OuterVolumeSpecName: "config") pod "629d83c6-3184-44ea-a2c5-5335c4acc9a3" (UID: "629d83c6-3184-44ea-a2c5-5335c4acc9a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.209665 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/629d83c6-3184-44ea-a2c5-5335c4acc9a3-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.210378 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config" (OuterVolumeSpecName: "config") pod "ed22773d-6d16-48d9-87a7-b5b7aac712fd" (UID: "ed22773d-6d16-48d9-87a7-b5b7aac712fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.211302 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed22773d-6d16-48d9-87a7-b5b7aac712fd" (UID: "ed22773d-6d16-48d9-87a7-b5b7aac712fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.214017 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g" (OuterVolumeSpecName: "kube-api-access-jmc6g") pod "ed22773d-6d16-48d9-87a7-b5b7aac712fd" (UID: "ed22773d-6d16-48d9-87a7-b5b7aac712fd"). InnerVolumeSpecName "kube-api-access-jmc6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.218199 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77" (OuterVolumeSpecName: "kube-api-access-zpx77") pod "629d83c6-3184-44ea-a2c5-5335c4acc9a3" (UID: "629d83c6-3184-44ea-a2c5-5335c4acc9a3"). InnerVolumeSpecName "kube-api-access-zpx77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.311630 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpx77\" (UniqueName: \"kubernetes.io/projected/629d83c6-3184-44ea-a2c5-5335c4acc9a3-kube-api-access-zpx77\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.311706 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmc6g\" (UniqueName: \"kubernetes.io/projected/ed22773d-6d16-48d9-87a7-b5b7aac712fd-kube-api-access-jmc6g\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.311721 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.311729 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed22773d-6d16-48d9-87a7-b5b7aac712fd-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.378420 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.378404 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-98nc2" event={"ID":"629d83c6-3184-44ea-a2c5-5335c4acc9a3","Type":"ContainerDied","Data":"1f98e13bafd57b0c49b159a62adfc590cb5481848cc7d77b6d9224ffd643177b"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.379581 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7nh5c" event={"ID":"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12","Type":"ContainerStarted","Data":"45eae3de731fd947cc837528174f2ca0d965d9287709d418c1c8f2becbfe2ff1"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.382635 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9999b422-d127-4990-8091-9446e589839a","Type":"ContainerStarted","Data":"a2e731e197d45efea792945eb2c6542ea9daf1e7d8e8fc74cadb19111748722d"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.384090 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"14639c36-341c-4f90-980b-b9fffce3c8f8","Type":"ContainerStarted","Data":"4f653d09a2f4d94c47b29f6a30335abbfca162659825f885e0fbeca3d79994a4"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.385238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" event={"ID":"ed22773d-6d16-48d9-87a7-b5b7aac712fd","Type":"ContainerDied","Data":"61d813c312f8142b790c59cc62e85601655cf976b642f8532d6369cabbe70cc3"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.385297 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k7gd8" Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.387700 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zczb4" event={"ID":"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a","Type":"ContainerStarted","Data":"64eba0c914ddda6ad2de95b77ca6f79300994c3e3f90169e78869e6bd7d48b38"} Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.436117 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.450465 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-98nc2"] Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.465921 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:02:47 crc kubenswrapper[4693]: I1204 10:02:47.472051 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k7gd8"] Dec 04 10:02:48 crc kubenswrapper[4693]: I1204 10:02:48.396400 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b85d3f3e-5811-4829-8b36-96ecb7f22492","Type":"ContainerStarted","Data":"45043947bcba0bebdc06a917604b39b01ca1ced729ec566ca212700e41000845"} Dec 04 10:02:48 crc kubenswrapper[4693]: I1204 10:02:48.473056 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="629d83c6-3184-44ea-a2c5-5335c4acc9a3" path="/var/lib/kubelet/pods/629d83c6-3184-44ea-a2c5-5335c4acc9a3/volumes" Dec 04 10:02:48 crc kubenswrapper[4693]: I1204 10:02:48.473457 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed22773d-6d16-48d9-87a7-b5b7aac712fd" 
path="/var/lib/kubelet/pods/ed22773d-6d16-48d9-87a7-b5b7aac712fd/volumes" Dec 04 10:02:52 crc kubenswrapper[4693]: I1204 10:02:52.273545 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:02:52 crc kubenswrapper[4693]: I1204 10:02:52.273908 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.451304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b831c57-110e-406f-b9a9-3c619add6639","Type":"ContainerStarted","Data":"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1"} Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.451923 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.453284 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9999b422-d127-4990-8091-9446e589839a","Type":"ContainerStarted","Data":"2144fae9599032321ca352c69eea8d941c2db557e3176a5b81d4b97e2ae18a13"} Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.455065 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f304446b-e129-40a5-bc56-a79d0b973f0a","Type":"ContainerStarted","Data":"9613247ee956e1e1dcf77c9f68d1b20953a8be816adcb7b1aa61b478cb995449"} Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.455201 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.456602 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"14639c36-341c-4f90-980b-b9fffce3c8f8","Type":"ContainerStarted","Data":"47ae3a2841aeced425cdc3a40fc5e866846098ed7de0406bc0c3b3d7a9951895"} Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.457668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e55c8437-1394-45c9-b135-2dbe68895d38","Type":"ContainerStarted","Data":"dd1f6876e20c9783b69c77e74f7e360d48dc2a93e85260774dc928fc59ecfcea"} Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.476073 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.2564027429999998 podStartE2EDuration="52.476055653s" podCreationTimestamp="2025-12-04 10:02:03 +0000 UTC" firstStartedPulling="2025-12-04 10:02:04.8032847 +0000 UTC m=+1170.700878453" lastFinishedPulling="2025-12-04 10:02:55.02293761 +0000 UTC m=+1220.920531363" observedRunningTime="2025-12-04 10:02:55.474143661 +0000 UTC m=+1221.371737414" watchObservedRunningTime="2025-12-04 10:02:55.476055653 +0000 UTC m=+1221.373649406" Dec 04 10:02:55 crc kubenswrapper[4693]: I1204 10:02:55.496101 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.97325839 podStartE2EDuration="54.496083976s" podCreationTimestamp="2025-12-04 
10:02:01 +0000 UTC" firstStartedPulling="2025-12-04 10:02:02.444189605 +0000 UTC m=+1168.341783358" lastFinishedPulling="2025-12-04 10:02:45.967015201 +0000 UTC m=+1211.864608944" observedRunningTime="2025-12-04 10:02:55.494673217 +0000 UTC m=+1221.392266980" watchObservedRunningTime="2025-12-04 10:02:55.496083976 +0000 UTC m=+1221.393677719" Dec 04 10:02:56 crc kubenswrapper[4693]: I1204 10:02:56.473325 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7nh5c" event={"ID":"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12","Type":"ContainerStarted","Data":"dea3fa0c5eb5aa3eeddb3eca957bd14e997be8abb99f165bbff391e98b0e9e40"} Dec 04 10:02:56 crc kubenswrapper[4693]: I1204 10:02:56.473710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zczb4" event={"ID":"ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a","Type":"ContainerStarted","Data":"e18409c6559e706b2c6d80be2e209d01a5b06325be185fadeace4da4e03c1951"} Dec 04 10:02:58 crc kubenswrapper[4693]: I1204 10:02:58.493998 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerStarted","Data":"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77"} Dec 04 10:02:58 crc kubenswrapper[4693]: I1204 10:02:58.496471 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerStarted","Data":"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995"} Dec 04 10:02:58 crc kubenswrapper[4693]: I1204 10:02:58.496670 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zczb4" Dec 04 10:02:58 crc kubenswrapper[4693]: I1204 10:02:58.563815 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zczb4" podStartSLOduration=43.603854514 podStartE2EDuration="51.563799444s" podCreationTimestamp="2025-12-04 10:02:07 +0000 UTC" firstStartedPulling="2025-12-04 10:02:47.084366731 +0000 UTC m=+1212.981960484" lastFinishedPulling="2025-12-04 10:02:55.044311661 +0000 UTC m=+1220.941905414" observedRunningTime="2025-12-04 10:02:58.56216276 +0000 UTC m=+1224.459756503" watchObservedRunningTime="2025-12-04 10:02:58.563799444 +0000 UTC m=+1224.461393197" Dec 04 10:02:59 crc kubenswrapper[4693]: I1204 10:02:59.514435 4693 generic.go:334] "Generic (PLEG): container finished" podID="cd544a1e-c7e1-4f04-90f5-d9cf152c4f12" containerID="dea3fa0c5eb5aa3eeddb3eca957bd14e997be8abb99f165bbff391e98b0e9e40" exitCode=0 Dec 04 10:02:59 crc kubenswrapper[4693]: I1204 10:02:59.514530 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7nh5c" event={"ID":"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12","Type":"ContainerDied","Data":"dea3fa0c5eb5aa3eeddb3eca957bd14e997be8abb99f165bbff391e98b0e9e40"} Dec 04 10:03:00 crc kubenswrapper[4693]: I1204 10:03:00.532074 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7nh5c" event={"ID":"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12","Type":"ContainerStarted","Data":"fd92dfe52712b01fc463d7d47afe023c2e0fbbded7cb84fd6717d47ac0c79a17"} Dec 04 10:03:00 crc kubenswrapper[4693]: I1204 10:03:00.536228 4693 generic.go:334] "Generic (PLEG): container finished" podID="b85d3f3e-5811-4829-8b36-96ecb7f22492" containerID="45043947bcba0bebdc06a917604b39b01ca1ced729ec566ca212700e41000845" exitCode=0 Dec 04 10:03:00 crc kubenswrapper[4693]: I1204 
10:03:00.536260 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b85d3f3e-5811-4829-8b36-96ecb7f22492","Type":"ContainerDied","Data":"45043947bcba0bebdc06a917604b39b01ca1ced729ec566ca212700e41000845"} Dec 04 10:03:02 crc kubenswrapper[4693]: I1204 10:03:02.078543 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.560365 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7nh5c" event={"ID":"cd544a1e-c7e1-4f04-90f5-d9cf152c4f12","Type":"ContainerStarted","Data":"8e803c275c503145984aa86f0c77b37a83bf086c6285b9352a93929d4ff42f36"} Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.560757 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.560769 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.578423 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b85d3f3e-5811-4829-8b36-96ecb7f22492","Type":"ContainerStarted","Data":"f22f5c51691f2ecb19aa9bacb0a579a759dfe5822ae56155adb7fd56485188c3"} Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.597100 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7nh5c" podStartSLOduration=48.646337521 podStartE2EDuration="56.593231867s" podCreationTimestamp="2025-12-04 10:02:07 +0000 UTC" firstStartedPulling="2025-12-04 10:02:47.086746705 +0000 UTC m=+1212.984340458" lastFinishedPulling="2025-12-04 10:02:55.033641051 +0000 UTC m=+1220.931234804" observedRunningTime="2025-12-04 10:03:03.585521829 +0000 UTC m=+1229.483115582" watchObservedRunningTime="2025-12-04 10:03:03.593231867 +0000 UTC m=+1229.490825620" Dec 04 10:03:03 crc kubenswrapper[4693]: I1204 10:03:03.608909 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.755685981 podStartE2EDuration="1m3.608891863s" podCreationTimestamp="2025-12-04 10:02:00 +0000 UTC" firstStartedPulling="2025-12-04 10:02:02.777715752 +0000 UTC m=+1168.675309495" lastFinishedPulling="2025-12-04 10:02:45.630921624 +0000 UTC m=+1211.528515377" observedRunningTime="2025-12-04 10:03:03.604753831 +0000 UTC m=+1229.502347584" watchObservedRunningTime="2025-12-04 10:03:03.608891863 +0000 UTC m=+1229.506485606" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.257547 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.373961 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.424488 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.428392 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.440196 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.587806 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"14639c36-341c-4f90-980b-b9fffce3c8f8","Type":"ContainerStarted","Data":"d65f58ddba78fb7aa00ce10b173204b934d72102c581fa19a98080e1fbc8cf76"} Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.589271 4693 generic.go:334] "Generic (PLEG): container finished" podID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" containerID="ca7d08f1999f4266aa35c26aeee7f00c0718c80f612d23ae5508f6ae6d18a580" exitCode=0 Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.589374 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" event={"ID":"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a","Type":"ContainerDied","Data":"ca7d08f1999f4266aa35c26aeee7f00c0718c80f612d23ae5508f6ae6d18a580"} Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.590832 4693 generic.go:334] "Generic (PLEG): container finished" podID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" containerID="cda2e834b1e0934271da0fcb5b08a375a0fa6d6f8d70ee2f7bff44afaeb235f9" exitCode=0 Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.590997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" event={"ID":"0ecabc57-5f50-49ef-8d1b-c8daa268642f","Type":"ContainerDied","Data":"cda2e834b1e0934271da0fcb5b08a375a0fa6d6f8d70ee2f7bff44afaeb235f9"} Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.609767 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.610219 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.610268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vzh\" (UniqueName: \"kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.614568 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=41.419481689 podStartE2EDuration="57.614539963s" podCreationTimestamp="2025-12-04 10:02:07 +0000 UTC" firstStartedPulling="2025-12-04 10:02:47.084351101 +0000 UTC m=+1212.981944854" lastFinishedPulling="2025-12-04 10:03:03.279409375 +0000 UTC m=+1229.177003128" observedRunningTime="2025-12-04 10:03:04.607642326 +0000 UTC m=+1230.505236079" watchObservedRunningTime="2025-12-04 10:03:04.614539963 +0000 UTC m=+1230.512133746" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 
10:03:04.712384 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.712437 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vzh\" (UniqueName: \"kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.712509 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.713580 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.713936 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.730753 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vzh\" (UniqueName: \"kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh\") pod \"dnsmasq-dns-7cb5889db5-rvczs\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.749307 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:04 crc kubenswrapper[4693]: I1204 10:03:04.947396 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.119264 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc\") pod \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.119561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlds9\" (UniqueName: \"kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9\") pod \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.119657 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config\") pod \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\" (UID: \"0ecabc57-5f50-49ef-8d1b-c8daa268642f\") " Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.140531 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9" (OuterVolumeSpecName: "kube-api-access-wlds9") pod "0ecabc57-5f50-49ef-8d1b-c8daa268642f" (UID: "0ecabc57-5f50-49ef-8d1b-c8daa268642f"). InnerVolumeSpecName "kube-api-access-wlds9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.142124 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config" (OuterVolumeSpecName: "config") pod "0ecabc57-5f50-49ef-8d1b-c8daa268642f" (UID: "0ecabc57-5f50-49ef-8d1b-c8daa268642f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.153245 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ecabc57-5f50-49ef-8d1b-c8daa268642f" (UID: "0ecabc57-5f50-49ef-8d1b-c8daa268642f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.222001 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.222058 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlds9\" (UniqueName: \"kubernetes.io/projected/0ecabc57-5f50-49ef-8d1b-c8daa268642f-kube-api-access-wlds9\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.222075 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecabc57-5f50-49ef-8d1b-c8daa268642f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.274735 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.536506 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.537310 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" containerName="init" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.537348 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" containerName="init" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.537666 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" containerName="init" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.544364 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.547872 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.548245 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.548263 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9dzhf" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.550260 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.570585 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.602264 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" event={"ID":"0ecabc57-5f50-49ef-8d1b-c8daa268642f","Type":"ContainerDied","Data":"44dddc0587ef9ccc05e7966f7571bc7e084cf397cfab99e9458e591799c0f65b"} Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.602322 4693 scope.go:117] "RemoveContainer" containerID="cda2e834b1e0934271da0fcb5b08a375a0fa6d6f8d70ee2f7bff44afaeb235f9" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.602453 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9js7w" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.606462 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" event={"ID":"358ef7ce-74ba-4b1d-a654-7f82cd36bb80","Type":"ContainerStarted","Data":"798ec1f83d5dd4e15a1ac41b2cb1caa7b8781b4ed42077134805d7d3b11a4d5a"} Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.606495 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" event={"ID":"358ef7ce-74ba-4b1d-a654-7f82cd36bb80","Type":"ContainerStarted","Data":"6d1d3e5fdb8ca9efa67ed17b1cbbd119789f4efbb0a1cb8bee187aec4dbc9502"} Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.628485 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.628635 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.628660 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-lock\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.628704 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svgzn\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-kube-api-access-svgzn\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.628722 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-cache\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.636574 4693 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 04 10:03:05 crc kubenswrapper[4693]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 10:03:05 crc kubenswrapper[4693]: > podSandboxID="c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab" Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.636750 4693 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 04 10:03:05 crc kubenswrapper[4693]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) 
--port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vk9zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-jl2ft_openstack(3e20e730-0b42-4aad-bcb0-6f0b77c04c3a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 10:03:05 crc kubenswrapper[4693]: > logger="UnhandledError" Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.638140 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.696309 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.703479 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9js7w"] Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.730623 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-svgzn\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-kube-api-access-svgzn\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.730684 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-cache\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.730792 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.730917 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.730945 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-lock\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.731480 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-lock\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.731681 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.731704 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.731731 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: E1204 10:03:05.731754 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:06.23173668 +0000 UTC m=+1232.129330443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.732242 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73554998-24a4-4d23-a78d-66d51cbe24af-cache\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.751639 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svgzn\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-kube-api-access-svgzn\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:05 crc kubenswrapper[4693]: I1204 10:03:05.763149 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.035280 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.080103 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.238408 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:06 crc kubenswrapper[4693]: E1204 10:03:06.238599 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:06 crc kubenswrapper[4693]: E1204 10:03:06.238634 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:06 crc kubenswrapper[4693]: E1204 10:03:06.238709 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:07.238686182 +0000 UTC m=+1233.136279935 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.472454 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecabc57-5f50-49ef-8d1b-c8daa268642f" path="/var/lib/kubelet/pods/0ecabc57-5f50-49ef-8d1b-c8daa268642f/volumes" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.613696 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.660114 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 10:03:06 crc kubenswrapper[4693]: I1204 10:03:06.961988 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.000743 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.008316 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.013205 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.023556 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-clqww"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.025197 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.030624 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.044103 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.056183 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxb5b\" (UniqueName: \"kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.056256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.056395 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.056430 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.093534 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-clqww"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.157979 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158026 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158067 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovs-rundir\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158082 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158107 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxb5b\" (UniqueName: \"kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158124 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-combined-ca-bundle\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158143 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovn-rundir\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158165 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158211 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvcgn\" (UniqueName: \"kubernetes.io/projected/e8b66ffe-c672-438c-ab15-a4a44563152d-kube-api-access-dvcgn\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.158236 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b66ffe-c672-438c-ab15-a4a44563152d-config\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.159191 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.159727 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.160575 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.185214 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxb5b\" (UniqueName: \"kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b\") pod \"dnsmasq-dns-57d65f699f-rc6f6\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259423 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovn-rundir\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259543 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvcgn\" (UniqueName: \"kubernetes.io/projected/e8b66ffe-c672-438c-ab15-a4a44563152d-kube-api-access-dvcgn\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259581 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b66ffe-c672-438c-ab15-a4a44563152d-config\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259632 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259694 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovs-rundir\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259716 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.259750 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-combined-ca-bundle\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.260828 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovn-rundir\") pod 
\"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.261154 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e8b66ffe-c672-438c-ab15-a4a44563152d-ovs-rundir\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: E1204 10:03:07.261228 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:07 crc kubenswrapper[4693]: E1204 10:03:07.261240 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:07 crc kubenswrapper[4693]: E1204 10:03:07.261276 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:09.261261981 +0000 UTC m=+1235.158855734 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.261318 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b66ffe-c672-438c-ab15-a4a44563152d-config\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.265147 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-combined-ca-bundle\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.266547 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8b66ffe-c672-438c-ab15-a4a44563152d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.280380 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvcgn\" (UniqueName: \"kubernetes.io/projected/e8b66ffe-c672-438c-ab15-a4a44563152d-kube-api-access-dvcgn\") pod \"ovn-controller-metrics-clqww\" (UID: \"e8b66ffe-c672-438c-ab15-a4a44563152d\") " pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.332268 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.371698 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.373681 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-clqww" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.395962 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.396678 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.397297 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.401013 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.409711 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.463668 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc\") pod \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.463827 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config\") pod \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.463949 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk9zl\" (UniqueName: \"kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl\") pod \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\" (UID: \"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a\") " Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.464242 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwzg\" (UniqueName: \"kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.464286 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.464442 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.464473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.464539 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.469142 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl" (OuterVolumeSpecName: "kube-api-access-vk9zl") pod "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" (UID: "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a"). InnerVolumeSpecName "kube-api-access-vk9zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.514682 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" (UID: "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.514784 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config" (OuterVolumeSpecName: "config") pod "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" (UID: "3e20e730-0b42-4aad-bcb0-6f0b77c04c3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.566793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwzg\" (UniqueName: \"kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.566847 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.566926 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.566951 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.566991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.567048 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.567059 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk9zl\" (UniqueName: \"kubernetes.io/projected/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-kube-api-access-vk9zl\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.567071 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.567854 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.568772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.570599 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.572697 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.580922 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwzg\" (UniqueName: \"kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg\") pod \"dnsmasq-dns-b8fbc5445-8vk68\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.621835 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.622050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jl2ft" event={"ID":"3e20e730-0b42-4aad-bcb0-6f0b77c04c3a","Type":"ContainerDied","Data":"c262cd2d8f2f88b89e37f59d85176423f77dfc59a448995dd99d5c29bb0738ab"} Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.622100 4693 scope.go:117] "RemoveContainer" containerID="ca7d08f1999f4266aa35c26aeee7f00c0718c80f612d23ae5508f6ae6d18a580" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.703593 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.713964 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.719399 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jl2ft"] Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.834765 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:07 crc kubenswrapper[4693]: W1204 10:03:07.840997 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07162790_3103_4a01_ba8d_fb948a5d57d4.slice/crio-e68f839f0388973d7d39cd502166815c9ffeb735d2a40a9cbab86d3ebb2bf554 WatchSource:0}: Error finding container e68f839f0388973d7d39cd502166815c9ffeb735d2a40a9cbab86d3ebb2bf554: Status 404 returned error can't find the container with id e68f839f0388973d7d39cd502166815c9ffeb735d2a40a9cbab86d3ebb2bf554 Dec 04 10:03:07 crc kubenswrapper[4693]: I1204 10:03:07.909971 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-clqww"] Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.140570 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:03:08 crc kubenswrapper[4693]: W1204 10:03:08.146057 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod334678ba_c391_4eb7_a693_37bb8dde6c26.slice/crio-409963b736a1973c6fc765a11a800168eda51124720dda32d86376488c3febc9 WatchSource:0}: Error finding container 409963b736a1973c6fc765a11a800168eda51124720dda32d86376488c3febc9: Status 404 returned error can't find the container with id 409963b736a1973c6fc765a11a800168eda51124720dda32d86376488c3febc9 Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.471118 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" path="/var/lib/kubelet/pods/3e20e730-0b42-4aad-bcb0-6f0b77c04c3a/volumes" Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.632000 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerStarted","Data":"e68f839f0388973d7d39cd502166815c9ffeb735d2a40a9cbab86d3ebb2bf554"} Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.633408 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerStarted","Data":"409963b736a1973c6fc765a11a800168eda51124720dda32d86376488c3febc9"} Dec 04 10:03:08 crc 
kubenswrapper[4693]: I1204 10:03:08.636314 4693 generic.go:334] "Generic (PLEG): container finished" podID="358ef7ce-74ba-4b1d-a654-7f82cd36bb80" containerID="798ec1f83d5dd4e15a1ac41b2cb1caa7b8781b4ed42077134805d7d3b11a4d5a" exitCode=0 Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.636369 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" event={"ID":"358ef7ce-74ba-4b1d-a654-7f82cd36bb80","Type":"ContainerDied","Data":"798ec1f83d5dd4e15a1ac41b2cb1caa7b8781b4ed42077134805d7d3b11a4d5a"} Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.637707 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-clqww" event={"ID":"e8b66ffe-c672-438c-ab15-a4a44563152d","Type":"ContainerStarted","Data":"2814873477df3125dc34664947eb6c408eb15388488876a0f15060c6da0fda58"} Dec 04 10:03:08 crc kubenswrapper[4693]: I1204 10:03:08.988670 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.089083 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config\") pod \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.089462 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vzh\" (UniqueName: \"kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh\") pod \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.089573 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc\") pod \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\" (UID: \"358ef7ce-74ba-4b1d-a654-7f82cd36bb80\") " Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.095920 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh" (OuterVolumeSpecName: "kube-api-access-c7vzh") pod "358ef7ce-74ba-4b1d-a654-7f82cd36bb80" (UID: "358ef7ce-74ba-4b1d-a654-7f82cd36bb80"). InnerVolumeSpecName "kube-api-access-c7vzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.118696 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "358ef7ce-74ba-4b1d-a654-7f82cd36bb80" (UID: "358ef7ce-74ba-4b1d-a654-7f82cd36bb80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.119195 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config" (OuterVolumeSpecName: "config") pod "358ef7ce-74ba-4b1d-a654-7f82cd36bb80" (UID: "358ef7ce-74ba-4b1d-a654-7f82cd36bb80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.191284 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7vzh\" (UniqueName: \"kubernetes.io/projected/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-kube-api-access-c7vzh\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.191371 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.191389 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358ef7ce-74ba-4b1d-a654-7f82cd36bb80-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.292575 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:09 crc kubenswrapper[4693]: E1204 10:03:09.292768 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:09 crc kubenswrapper[4693]: E1204 10:03:09.292789 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:09 crc kubenswrapper[4693]: E1204 10:03:09.292833 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:13.292816961 +0000 UTC m=+1239.190410714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.467577 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xz7js"] Dec 04 10:03:09 crc kubenswrapper[4693]: E1204 10:03:09.468181 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358ef7ce-74ba-4b1d-a654-7f82cd36bb80" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.468256 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="358ef7ce-74ba-4b1d-a654-7f82cd36bb80" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: E1204 10:03:09.468402 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.468484 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.468751 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e20e730-0b42-4aad-bcb0-6f0b77c04c3a" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.468869 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="358ef7ce-74ba-4b1d-a654-7f82cd36bb80" containerName="init" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.469701 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.472982 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.473719 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.481929 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.496054 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xz7js"] Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497137 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497180 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhfj\" (UniqueName: \"kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497288 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf\") pod \"swift-ring-rebalance-xz7js\" (UID: 
\"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497364 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497752 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.497885 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.498108 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.598608 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599095 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599157 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599225 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhfj\" (UniqueName: \"kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj\") pod \"swift-ring-rebalance-xz7js\" (UID: 
\"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599254 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.599278 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.600017 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.600445 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.600614 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.603531 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.604114 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.605691 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.622905 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhfj\" (UniqueName: \"kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj\") pod \"swift-ring-rebalance-xz7js\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.648536 4693 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" event={"ID":"358ef7ce-74ba-4b1d-a654-7f82cd36bb80","Type":"ContainerDied","Data":"6d1d3e5fdb8ca9efa67ed17b1cbbd119789f4efbb0a1cb8bee187aec4dbc9502"} Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.648606 4693 scope.go:117] "RemoveContainer" containerID="798ec1f83d5dd4e15a1ac41b2cb1caa7b8781b4ed42077134805d7d3b11a4d5a" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.648605 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-rvczs" Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.758129 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.766416 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-rvczs"] Dec 04 10:03:09 crc kubenswrapper[4693]: I1204 10:03:09.790956 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.477573 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358ef7ce-74ba-4b1d-a654-7f82cd36bb80" path="/var/lib/kubelet/pods/358ef7ce-74ba-4b1d-a654-7f82cd36bb80/volumes" Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.658702 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9999b422-d127-4990-8091-9446e589839a","Type":"ContainerStarted","Data":"712ee523e05b6b1228fa567e0ff8c9c701c28c0d366ee27ecfdbda52cc17937b"} Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.662549 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-clqww" event={"ID":"e8b66ffe-c672-438c-ab15-a4a44563152d","Type":"ContainerStarted","Data":"742a367682375dfcd5589b0c86cad7893d6c516e848dcfc4a0b923e61f646671"} Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.664418 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerStarted","Data":"595682c9f9402872e9595de887c92e45c410e93c9025eb8ba4e70d9245474e4a"} Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.668575 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerStarted","Data":"65c9483e3be87bc6966efc438372ef50398e641b2efc5a5336187e8627af09a1"} Dec 04 10:03:10 crc kubenswrapper[4693]: I1204 10:03:10.899213 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xz7js"] Dec 04 10:03:11 crc kubenswrapper[4693]: I1204 10:03:11.675428 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xz7js" event={"ID":"39170f53-93c9-49fd-8dba-42d325269e74","Type":"ContainerStarted","Data":"d040bc4e150b780285f1072d99c87a097173b3261a57b26eeafbee0d127ef09a"} Dec 04 10:03:11 crc kubenswrapper[4693]: I1204 10:03:11.678528 4693 generic.go:334] "Generic (PLEG): container finished" podID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerID="65c9483e3be87bc6966efc438372ef50398e641b2efc5a5336187e8627af09a1" exitCode=0 Dec 04 10:03:11 crc kubenswrapper[4693]: I1204 10:03:11.678582 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" 
event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerDied","Data":"65c9483e3be87bc6966efc438372ef50398e641b2efc5a5336187e8627af09a1"} Dec 04 10:03:11 crc kubenswrapper[4693]: I1204 10:03:11.679808 4693 generic.go:334] "Generic (PLEG): container finished" podID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerID="595682c9f9402872e9595de887c92e45c410e93c9025eb8ba4e70d9245474e4a" exitCode=0 Dec 04 10:03:11 crc kubenswrapper[4693]: I1204 10:03:11.679848 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerDied","Data":"595682c9f9402872e9595de887c92e45c410e93c9025eb8ba4e70d9245474e4a"} Dec 04 10:03:12 crc kubenswrapper[4693]: I1204 10:03:12.101784 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 10:03:12 crc kubenswrapper[4693]: I1204 10:03:12.102127 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 10:03:13 crc kubenswrapper[4693]: I1204 10:03:13.363829 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:13 crc kubenswrapper[4693]: E1204 10:03:13.364090 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:13 crc kubenswrapper[4693]: E1204 10:03:13.364401 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:13 crc kubenswrapper[4693]: E1204 10:03:13.364484 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:21.364458243 +0000 UTC m=+1247.262051996 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:13 crc kubenswrapper[4693]: I1204 10:03:13.731131 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=46.701556614 podStartE2EDuration="1m3.731113876s" podCreationTimestamp="2025-12-04 10:02:10 +0000 UTC" firstStartedPulling="2025-12-04 10:02:47.088544424 +0000 UTC m=+1212.986138177" lastFinishedPulling="2025-12-04 10:03:04.118101686 +0000 UTC m=+1230.015695439" observedRunningTime="2025-12-04 10:03:13.729282115 +0000 UTC m=+1239.626875868" watchObservedRunningTime="2025-12-04 10:03:13.731113876 +0000 UTC m=+1239.628707629" Dec 04 10:03:13 crc kubenswrapper[4693]: I1204 10:03:13.755296 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-clqww" podStartSLOduration=7.755275612 podStartE2EDuration="7.755275612s" podCreationTimestamp="2025-12-04 10:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:13.747863638 +0000 UTC m=+1239.645457391" watchObservedRunningTime="2025-12-04 10:03:13.755275612 +0000 UTC m=+1239.652869365" Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.706158 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerStarted","Data":"e0228ed7b3806a077c1d1ecb90531aeadea88cbcd2c784cb1a25ac1bf04b7ee8"} Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.706572 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.708991 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerStarted","Data":"b65883d367c8f1186f7f3ad60905460e478d907704620a5e51611a0fc156a9fc"} Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.709098 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.710638 4693 generic.go:334] "Generic (PLEG): container finished" podID="e55c8437-1394-45c9-b135-2dbe68895d38" containerID="dd1f6876e20c9783b69c77e74f7e360d48dc2a93e85260774dc928fc59ecfcea" exitCode=0 Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.710667 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e55c8437-1394-45c9-b135-2dbe68895d38","Type":"ContainerDied","Data":"dd1f6876e20c9783b69c77e74f7e360d48dc2a93e85260774dc928fc59ecfcea"} Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.730230 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podStartSLOduration=7.7302131880000005 podStartE2EDuration="7.730213188s" podCreationTimestamp="2025-12-04 10:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:14.725040745 +0000 UTC m=+1240.622634508" watchObservedRunningTime="2025-12-04 10:03:14.730213188 +0000 UTC m=+1240.627806941" 
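The repeated MountVolume.SetUp failures above come from the swift-storage-0 pod's projected volume "etc-swift", which sources the ConfigMap "swift-ring-files" that does not exist yet; the kubelet retries the mount with a doubling backoff (durationBeforeRetry 500ms, 1s, 2s, 4s, 8s in the entries above) and the pod cannot start until every projected source resolves, presumably once the swift-ring-rebalance job publishes the ring files. As a minimal illustrative sketch, assuming only what the error messages show (the volume name "etc-swift" and ConfigMap name "swift-ring-files" are taken from the log; the surrounding spec layout is an assumption, not the operator's actual manifest), such a projected volume is declared roughly like this with the k8s.io/api/core/v1 types:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Projected volume whose only source is the "swift-ring-files"
	// ConfigMap. Until that ConfigMap exists in the namespace,
	// MountVolume.SetUp for this volume fails exactly as logged,
	// and the kubelet retries with an exponentially growing delay.
	vol := corev1.Volume{
		Name: "etc-swift",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{
						ConfigMap: &corev1.ConfigMapProjection{
							LocalObjectReference: corev1.LocalObjectReference{
								Name: "swift-ring-files",
							},
						},
					},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}

Because a projected volume is assembled by the kubelet at mount time rather than attached, a missing ConfigMap source surfaces as the SetUp error seen here instead of a scheduling failure, which is why swift-storage-0 keeps reappearing in the retry entries while the other volumes for the same pod mount successfully.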
Dec 04 10:03:14 crc kubenswrapper[4693]: I1204 10:03:14.753559 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" podStartSLOduration=8.753538881 podStartE2EDuration="8.753538881s" podCreationTimestamp="2025-12-04 10:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:14.746324182 +0000 UTC m=+1240.643917935" watchObservedRunningTime="2025-12-04 10:03:14.753538881 +0000 UTC m=+1240.651132634" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.208135 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.252280 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.720526 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.756765 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.905352 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.906906 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.909251 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.909350 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.909406 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.909636 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-475dd" Dec 04 10:03:15 crc kubenswrapper[4693]: I1204 10:03:15.931957 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.016835 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.016876 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.016911 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc 
kubenswrapper[4693]: I1204 10:03:16.016940 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.017127 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-scripts\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.017253 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvvr\" (UniqueName: \"kubernetes.io/projected/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-kube-api-access-jsvvr\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.017485 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-config\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118782 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-scripts\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvvr\" (UniqueName: \"kubernetes.io/projected/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-kube-api-access-jsvvr\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118914 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-config\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118956 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118973 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.118993 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.119015 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.119808 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-config\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.119952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-scripts\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.120070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.124250 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.124254 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.135140 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.139011 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvvr\" (UniqueName: \"kubernetes.io/projected/eec5c741-f1c6-424f-b3e1-4f5219fa0bf0-kube-api-access-jsvvr\") pod \"ovn-northd-0\" (UID: \"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0\") " pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.226605 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.486963 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 10:03:16 crc kubenswrapper[4693]: I1204 10:03:16.730585 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0","Type":"ContainerStarted","Data":"06bb9762655acb844a2cbc28b6c8871faf034e4e74fadaa5cff8716d01a8a377"} Dec 04 10:03:21 crc kubenswrapper[4693]: I1204 10:03:21.428318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:21 crc kubenswrapper[4693]: E1204 10:03:21.428635 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:21 crc kubenswrapper[4693]: E1204 10:03:21.429404 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:21 crc kubenswrapper[4693]: E1204 10:03:21.429491 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:03:37.429464813 +0000 UTC m=+1263.327058606 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:21 crc kubenswrapper[4693]: I1204 10:03:21.782426 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e55c8437-1394-45c9-b135-2dbe68895d38","Type":"ContainerStarted","Data":"90b83043c352e5f08378545a78fd7ab0555e9e6e8389bf8077397b348b7c1a0f"} Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.273229 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.273290 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.334561 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.716697 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.782162 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:22 crc kubenswrapper[4693]: I1204 10:03:22.792540 4693 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" containerID="cri-o://b65883d367c8f1186f7f3ad60905460e478d907704620a5e51611a0fc156a9fc" gracePeriod=10 Dec 04 10:03:27 crc kubenswrapper[4693]: I1204 10:03:27.333517 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Dec 04 10:03:28 crc kubenswrapper[4693]: I1204 10:03:28.341496 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zczb4" podUID="ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a" containerName="ovn-controller" probeResult="failure" output=< Dec 04 10:03:28 crc kubenswrapper[4693]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 10:03:28 crc kubenswrapper[4693]: > Dec 04 10:03:29 crc kubenswrapper[4693]: I1204 10:03:29.867459 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371945.98736 podStartE2EDuration="1m30.86741649s" podCreationTimestamp="2025-12-04 10:01:59 +0000 UTC" firstStartedPulling="2025-12-04 10:02:01.152990878 +0000 UTC m=+1167.050584631" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:29.866971217 +0000 UTC m=+1255.764564980" watchObservedRunningTime="2025-12-04 10:03:29.86741649 +0000 UTC m=+1255.765010243" Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.666992 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.667079 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.855267 4693 generic.go:334] "Generic (PLEG): container finished" podID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerID="1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77" exitCode=0 Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.855348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerDied","Data":"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77"} Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.859666 4693 generic.go:334] "Generic (PLEG): container finished" podID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerID="c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995" exitCode=0 Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.859755 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerDied","Data":"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995"} Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.862470 4693 generic.go:334] "Generic (PLEG): container finished" podID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerID="b65883d367c8f1186f7f3ad60905460e478d907704620a5e51611a0fc156a9fc" exitCode=0 Dec 04 10:03:30 crc kubenswrapper[4693]: I1204 10:03:30.863025 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" 
event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerDied","Data":"b65883d367c8f1186f7f3ad60905460e478d907704620a5e51611a0fc156a9fc"} Dec 04 10:03:31 crc kubenswrapper[4693]: I1204 10:03:31.641515 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 10:03:31 crc kubenswrapper[4693]: I1204 10:03:31.735060 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.347054 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zczb4" podUID="ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a" containerName="ovn-controller" probeResult="failure" output=< Dec 04 10:03:33 crc kubenswrapper[4693]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 10:03:33 crc kubenswrapper[4693]: > Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.372451 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.375374 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7nh5c" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.596568 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zczb4-config-8vwrl"] Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.597742 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.601559 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.609544 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zczb4-config-8vwrl"] Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747483 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747557 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747595 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747622 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn\") pod \"ovn-controller-zczb4-config-8vwrl\" 
(UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747722 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfc25\" (UniqueName: \"kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.747902 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850072 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850151 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfc25\" (UniqueName: \"kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850185 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850250 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850469 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn\") 
pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850523 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.850523 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.851049 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.852735 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.868643 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfc25\" (UniqueName: \"kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25\") pod \"ovn-controller-zczb4-config-8vwrl\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:33 crc kubenswrapper[4693]: I1204 10:03:33.928547 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.141500 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.309319 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb\") pod \"07162790-3103-4a01-ba8d-fb948a5d57d4\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.309796 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxb5b\" (UniqueName: \"kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b\") pod \"07162790-3103-4a01-ba8d-fb948a5d57d4\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.309837 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config\") pod \"07162790-3103-4a01-ba8d-fb948a5d57d4\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.309870 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc\") pod \"07162790-3103-4a01-ba8d-fb948a5d57d4\" (UID: \"07162790-3103-4a01-ba8d-fb948a5d57d4\") " Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.334049 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.344681 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b" (OuterVolumeSpecName: "kube-api-access-hxb5b") pod "07162790-3103-4a01-ba8d-fb948a5d57d4" (UID: "07162790-3103-4a01-ba8d-fb948a5d57d4"). InnerVolumeSpecName "kube-api-access-hxb5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.411778 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxb5b\" (UniqueName: \"kubernetes.io/projected/07162790-3103-4a01-ba8d-fb948a5d57d4-kube-api-access-hxb5b\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.461128 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07162790-3103-4a01-ba8d-fb948a5d57d4" (UID: "07162790-3103-4a01-ba8d-fb948a5d57d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.464095 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07162790-3103-4a01-ba8d-fb948a5d57d4" (UID: "07162790-3103-4a01-ba8d-fb948a5d57d4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.471533 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config" (OuterVolumeSpecName: "config") pod "07162790-3103-4a01-ba8d-fb948a5d57d4" (UID: "07162790-3103-4a01-ba8d-fb948a5d57d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.513981 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.514205 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.514224 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.514233 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07162790-3103-4a01-ba8d-fb948a5d57d4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:37 crc kubenswrapper[4693]: E1204 10:03:37.514393 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 10:03:37 crc kubenswrapper[4693]: E1204 10:03:37.514524 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 10:03:37 crc kubenswrapper[4693]: E1204 10:03:37.514596 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift podName:73554998-24a4-4d23-a78d-66d51cbe24af nodeName:}" failed. No retries permitted until 2025-12-04 10:04:09.514578375 +0000 UTC m=+1295.412172128 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift") pod "swift-storage-0" (UID: "73554998-24a4-4d23-a78d-66d51cbe24af") : configmap "swift-ring-files" not found Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.699804 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zczb4-config-8vwrl"] Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.919936 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0","Type":"ContainerStarted","Data":"f8f0ee2991648726ce5054404dd95493896185e56ad9c57ba50fc912c2b23966"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.920432 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eec5c741-f1c6-424f-b3e1-4f5219fa0bf0","Type":"ContainerStarted","Data":"8847d36644908ded9d9fcefa3a3b4cecd83978126e6e56aa4f593b55e406a93c"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.920535 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.922057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zczb4-config-8vwrl" event={"ID":"13c9b2d9-3a12-44e8-b407-4df7e721bd2c","Type":"ContainerStarted","Data":"d49c3dda1101235c56c263c4e96399beaf035a376764f00a2b83b26f202d3210"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.925090 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerStarted","Data":"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.925386 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.927886 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerStarted","Data":"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.928324 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.930180 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.930181 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-rc6f6" event={"ID":"07162790-3103-4a01-ba8d-fb948a5d57d4","Type":"ContainerDied","Data":"e68f839f0388973d7d39cd502166815c9ffeb735d2a40a9cbab86d3ebb2bf554"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.930256 4693 scope.go:117] "RemoveContainer" containerID="b65883d367c8f1186f7f3ad60905460e478d907704620a5e51611a0fc156a9fc" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.931828 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xz7js" event={"ID":"39170f53-93c9-49fd-8dba-42d325269e74","Type":"ContainerStarted","Data":"1c9ebccfb99136aceae2e54e6fce75a38fc5e34d50a59200f02f59900eee5f2e"} Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.951485 4693 scope.go:117] "RemoveContainer" containerID="595682c9f9402872e9595de887c92e45c410e93c9025eb8ba4e70d9245474e4a" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.952021 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.286642743 podStartE2EDuration="22.951999281s" podCreationTimestamp="2025-12-04 10:03:15 +0000 UTC" firstStartedPulling="2025-12-04 10:03:16.493427545 +0000 UTC m=+1242.391021298" lastFinishedPulling="2025-12-04 10:03:37.158784083 +0000 UTC m=+1263.056377836" observedRunningTime="2025-12-04 10:03:37.948997119 +0000 UTC m=+1263.846590862" watchObservedRunningTime="2025-12-04 10:03:37.951999281 +0000 UTC m=+1263.849593034" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.987215 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.132289581 podStartE2EDuration="1m40.987189953s" podCreationTimestamp="2025-12-04 10:01:57 +0000 UTC" firstStartedPulling="2025-12-04 10:02:00.199693348 +0000 UTC m=+1166.097287101" lastFinishedPulling="2025-12-04 10:02:55.05459371 +0000 UTC m=+1220.952187473" observedRunningTime="2025-12-04 10:03:37.972742194 +0000 UTC m=+1263.870335947" watchObservedRunningTime="2025-12-04 10:03:37.987189953 +0000 UTC m=+1263.884783706" Dec 04 10:03:37 crc kubenswrapper[4693]: I1204 10:03:37.996191 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xz7js" podStartSLOduration=2.714690667 podStartE2EDuration="28.99617242s" podCreationTimestamp="2025-12-04 10:03:09 +0000 UTC" firstStartedPulling="2025-12-04 10:03:10.907515554 +0000 UTC m=+1236.805109307" lastFinishedPulling="2025-12-04 10:03:37.188997307 +0000 UTC m=+1263.086591060" observedRunningTime="2025-12-04 10:03:37.990792313 +0000 UTC m=+1263.888386066" watchObservedRunningTime="2025-12-04 10:03:37.99617242 +0000 UTC m=+1263.893766173" Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.020429 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.170832012 podStartE2EDuration="1m41.02041478s" podCreationTimestamp="2025-12-04 10:01:57 +0000 UTC" firstStartedPulling="2025-12-04 10:02:00.20310089 +0000 UTC m=+1166.100694633" lastFinishedPulling="2025-12-04 10:02:55.052683648 +0000 UTC m=+1220.950277401" observedRunningTime="2025-12-04 10:03:38.013000495 +0000 UTC m=+1263.910594258" watchObservedRunningTime="2025-12-04 10:03:38.02041478 +0000 UTC m=+1263.918008533" Dec 04 10:03:38 crc 
kubenswrapper[4693]: I1204 10:03:38.040560 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.051822 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-rc6f6"] Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.365884 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zczb4" Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.470371 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" path="/var/lib/kubelet/pods/07162790-3103-4a01-ba8d-fb948a5d57d4/volumes" Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.945737 4693 generic.go:334] "Generic (PLEG): container finished" podID="13c9b2d9-3a12-44e8-b407-4df7e721bd2c" containerID="fad32760f70ee8315597994a591cd775d292bf8360d7a15ed9ede88765fdf706" exitCode=0 Dec 04 10:03:38 crc kubenswrapper[4693]: I1204 10:03:38.946384 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zczb4-config-8vwrl" event={"ID":"13c9b2d9-3a12-44e8-b407-4df7e721bd2c","Type":"ContainerDied","Data":"fad32760f70ee8315597994a591cd775d292bf8360d7a15ed9ede88765fdf706"} Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.285766 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.406846 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.406898 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.406944 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.406969 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407022 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407060 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run" (OuterVolumeSpecName: "var-run") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407119 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfc25\" (UniqueName: \"kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407264 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts\") pod \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\" (UID: \"13c9b2d9-3a12-44e8-b407-4df7e721bd2c\") " Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.407961 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.408106 4693 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.408133 4693 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.408146 4693 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.408158 4693 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.408356 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts" (OuterVolumeSpecName: "scripts") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.414266 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25" (OuterVolumeSpecName: "kube-api-access-nfc25") pod "13c9b2d9-3a12-44e8-b407-4df7e721bd2c" (UID: "13c9b2d9-3a12-44e8-b407-4df7e721bd2c"). InnerVolumeSpecName "kube-api-access-nfc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.509854 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.510144 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfc25\" (UniqueName: \"kubernetes.io/projected/13c9b2d9-3a12-44e8-b407-4df7e721bd2c-kube-api-access-nfc25\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.780720 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.873212 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.964601 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zczb4-config-8vwrl" event={"ID":"13c9b2d9-3a12-44e8-b407-4df7e721bd2c","Type":"ContainerDied","Data":"d49c3dda1101235c56c263c4e96399beaf035a376764f00a2b83b26f202d3210"} Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.964645 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49c3dda1101235c56c263c4e96399beaf035a376764f00a2b83b26f202d3210" Dec 04 10:03:40 crc kubenswrapper[4693]: I1204 10:03:40.964967 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zczb4-config-8vwrl" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.416573 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zczb4-config-8vwrl"] Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.426258 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zczb4-config-8vwrl"] Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.720844 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-84c8-account-create-update-wmv28"] Dec 04 10:03:41 crc kubenswrapper[4693]: E1204 10:03:41.721204 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.721223 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" Dec 04 10:03:41 crc kubenswrapper[4693]: E1204 10:03:41.721246 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c9b2d9-3a12-44e8-b407-4df7e721bd2c" containerName="ovn-config" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.721254 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c9b2d9-3a12-44e8-b407-4df7e721bd2c" containerName="ovn-config" Dec 04 10:03:41 crc kubenswrapper[4693]: E1204 10:03:41.721273 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="init" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.721281 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="init" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.721488 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="07162790-3103-4a01-ba8d-fb948a5d57d4" containerName="dnsmasq-dns" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.721505 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c9b2d9-3a12-44e8-b407-4df7e721bd2c" containerName="ovn-config" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.722021 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.724295 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.775823 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84c8-account-create-update-wmv28"] Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.813476 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ck5pq"] Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.814741 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.828454 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ck5pq"] Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.830358 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.830396 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdmq\" (UniqueName: \"kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.932468 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4d6\" (UniqueName: \"kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.932665 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.932805 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.932852 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdmq\" (UniqueName: \"kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.933729 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:41 crc kubenswrapper[4693]: I1204 10:03:41.969883 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdmq\" (UniqueName: \"kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq\") pod \"keystone-84c8-account-create-update-wmv28\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " 
pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.034075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4d6\" (UniqueName: \"kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.034167 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.034832 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.035680 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-h22jp"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.036884 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.047076 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.056341 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h22jp"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.064557 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6716-account-create-update-v4sm8"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.066003 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.068780 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.071870 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4d6\" (UniqueName: \"kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6\") pod \"keystone-db-create-ck5pq\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.079687 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6716-account-create-update-v4sm8"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.129670 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.137019 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27c2\" (UniqueName: \"kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.137113 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7xs\" (UniqueName: \"kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.137204 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.137459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.224256 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gb6kd"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.225592 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.236950 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gb6kd"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.238624 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.238701 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27c2\" (UniqueName: \"kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.238742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7xs\" (UniqueName: \"kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.238765 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.239471 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.239934 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.264740 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7xs\" (UniqueName: \"kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs\") pod \"placement-6716-account-create-update-v4sm8\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.269135 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27c2\" (UniqueName: \"kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2\") pod \"placement-db-create-h22jp\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.360825 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh44x\" (UniqueName: \"kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x\") pod \"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.360915 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts\") pod \"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.362741 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h22jp" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.375503 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ba22-account-create-update-vzw95"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.380026 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ba22-account-create-update-vzw95"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.380117 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.382630 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.463301 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.463386 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phr2k\" (UniqueName: \"kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.463425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh44x\" (UniqueName: \"kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x\") pod \"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.463697 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts\") pod \"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.464448 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts\") pod 
\"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.471860 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c9b2d9-3a12-44e8-b407-4df7e721bd2c" path="/var/lib/kubelet/pods/13c9b2d9-3a12-44e8-b407-4df7e721bd2c/volumes" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.493820 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh44x\" (UniqueName: \"kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x\") pod \"glance-db-create-gb6kd\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.525763 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.565105 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.565204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phr2k\" (UniqueName: \"kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.567682 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.585259 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.585747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phr2k\" (UniqueName: \"kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k\") pod \"glance-ba22-account-create-update-vzw95\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.704463 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ck5pq"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.708746 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.710149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-h22jp"] Dec 04 10:03:42 crc kubenswrapper[4693]: I1204 10:03:42.721225 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-84c8-account-create-update-wmv28"] Dec 04 10:03:42 crc kubenswrapper[4693]: W1204 10:03:42.722996 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7f9fe71_06d0_4075_a262_74050f6b73d7.slice/crio-6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc WatchSource:0}: Error finding container 6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc: Status 404 returned error can't find the container with id 6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.018562 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84c8-account-create-update-wmv28" event={"ID":"c7f9fe71-06d0-4075-a262-74050f6b73d7","Type":"ContainerStarted","Data":"6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc"} Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.030824 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h22jp" event={"ID":"561053fe-0024-4f25-bfab-94a8e139ac06","Type":"ContainerStarted","Data":"f4ad3dd47ebf20ce30af94306af7e5a1c5d2d6fbbddb1fe8e7cfea4d40bf0051"} Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.044457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5pq" event={"ID":"df7aea09-5790-4fda-9a37-8ada0326c2d0","Type":"ContainerStarted","Data":"a7db98fef6b424e037733a52110d9796283992511596cfa90371cb52cf77b9fa"} Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.060554 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ba22-account-create-update-vzw95"] Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.079495 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6716-account-create-update-v4sm8"] Dec 04 10:03:43 crc kubenswrapper[4693]: W1204 10:03:43.094497 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1e98765_1b21_4b1a_80be_e5dc19d13082.slice/crio-75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5 WatchSource:0}: Error finding container 75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5: Status 404 returned error can't find the container with id 75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5 Dec 04 10:03:43 crc kubenswrapper[4693]: I1204 10:03:43.166298 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gb6kd"] Dec 04 10:03:43 crc kubenswrapper[4693]: W1204 10:03:43.177772 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ebe1ff2_6f41_4f5b_a346_5664fc8c1d75.slice/crio-239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175 WatchSource:0}: Error finding container 239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175: Status 404 returned error can't find the container with id 239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 
10:03:44.057323 4693 generic.go:334] "Generic (PLEG): container finished" podID="9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" containerID="83e70c585c53621a59ee096045bbd9feccbef3d920036abb57870ca78e08a542" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.057455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gb6kd" event={"ID":"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75","Type":"ContainerDied","Data":"83e70c585c53621a59ee096045bbd9feccbef3d920036abb57870ca78e08a542"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.057799 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gb6kd" event={"ID":"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75","Type":"ContainerStarted","Data":"239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.059162 4693 generic.go:334] "Generic (PLEG): container finished" podID="df7aea09-5790-4fda-9a37-8ada0326c2d0" containerID="1da0bda23c5533ab29df254eb9925799aa521a038d9779a2295fedb2af048e9c" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.059231 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5pq" event={"ID":"df7aea09-5790-4fda-9a37-8ada0326c2d0","Type":"ContainerDied","Data":"1da0bda23c5533ab29df254eb9925799aa521a038d9779a2295fedb2af048e9c"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.060621 4693 generic.go:334] "Generic (PLEG): container finished" podID="14f8a950-a259-4279-99e9-d33a4fc93e7d" containerID="3e3f5a6d15a1b4477625cf1b05c6de6234542606e032cec47ed7a06b9f1e8a44" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.060671 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ba22-account-create-update-vzw95" event={"ID":"14f8a950-a259-4279-99e9-d33a4fc93e7d","Type":"ContainerDied","Data":"3e3f5a6d15a1b4477625cf1b05c6de6234542606e032cec47ed7a06b9f1e8a44"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.060692 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ba22-account-create-update-vzw95" event={"ID":"14f8a950-a259-4279-99e9-d33a4fc93e7d","Type":"ContainerStarted","Data":"7dc66b5c86dd5ae68e1292e7266361cf2b2ea8a6d42ef5cc6f95d54d1ba21d03"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.062164 4693 generic.go:334] "Generic (PLEG): container finished" podID="c7f9fe71-06d0-4075-a262-74050f6b73d7" containerID="31a4a898c67dbbd2395f3445f8b9591483d46d43dcd95e79b00268b365e55d7b" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.062241 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84c8-account-create-update-wmv28" event={"ID":"c7f9fe71-06d0-4075-a262-74050f6b73d7","Type":"ContainerDied","Data":"31a4a898c67dbbd2395f3445f8b9591483d46d43dcd95e79b00268b365e55d7b"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.063857 4693 generic.go:334] "Generic (PLEG): container finished" podID="561053fe-0024-4f25-bfab-94a8e139ac06" containerID="69327e0323421578e04dda6f967a1f364d5e266f184aad070cddcc2470dc1de2" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.063906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h22jp" event={"ID":"561053fe-0024-4f25-bfab-94a8e139ac06","Type":"ContainerDied","Data":"69327e0323421578e04dda6f967a1f364d5e266f184aad070cddcc2470dc1de2"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.065178 4693 generic.go:334] "Generic (PLEG): container 
finished" podID="b1e98765-1b21-4b1a-80be-e5dc19d13082" containerID="6766b86962fb54c643e411c42447b10e6ebf764c420d665a426178b94907f5f9" exitCode=0 Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.065215 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6716-account-create-update-v4sm8" event={"ID":"b1e98765-1b21-4b1a-80be-e5dc19d13082","Type":"ContainerDied","Data":"6766b86962fb54c643e411c42447b10e6ebf764c420d665a426178b94907f5f9"} Dec 04 10:03:44 crc kubenswrapper[4693]: I1204 10:03:44.065242 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6716-account-create-update-v4sm8" event={"ID":"b1e98765-1b21-4b1a-80be-e5dc19d13082","Type":"ContainerStarted","Data":"75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5"} Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.869082 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.875466 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.882779 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.890235 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959482 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts\") pod \"14f8a950-a259-4279-99e9-d33a4fc93e7d\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959548 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts\") pod \"b1e98765-1b21-4b1a-80be-e5dc19d13082\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb7xs\" (UniqueName: \"kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs\") pod \"b1e98765-1b21-4b1a-80be-e5dc19d13082\" (UID: \"b1e98765-1b21-4b1a-80be-e5dc19d13082\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959755 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts\") pod \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959796 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdmq\" (UniqueName: \"kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq\") pod \"c7f9fe71-06d0-4075-a262-74050f6b73d7\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959834 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-phr2k\" (UniqueName: \"kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k\") pod \"14f8a950-a259-4279-99e9-d33a4fc93e7d\" (UID: \"14f8a950-a259-4279-99e9-d33a4fc93e7d\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959873 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh44x\" (UniqueName: \"kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x\") pod \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\" (UID: \"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.959893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts\") pod \"c7f9fe71-06d0-4075-a262-74050f6b73d7\" (UID: \"c7f9fe71-06d0-4075-a262-74050f6b73d7\") " Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.961004 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7f9fe71-06d0-4075-a262-74050f6b73d7" (UID: "c7f9fe71-06d0-4075-a262-74050f6b73d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.961420 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14f8a950-a259-4279-99e9-d33a4fc93e7d" (UID: "14f8a950-a259-4279-99e9-d33a4fc93e7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.961786 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1e98765-1b21-4b1a-80be-e5dc19d13082" (UID: "b1e98765-1b21-4b1a-80be-e5dc19d13082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.962907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" (UID: "9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.967348 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs" (OuterVolumeSpecName: "kube-api-access-gb7xs") pod "b1e98765-1b21-4b1a-80be-e5dc19d13082" (UID: "b1e98765-1b21-4b1a-80be-e5dc19d13082"). InnerVolumeSpecName "kube-api-access-gb7xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.967385 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq" (OuterVolumeSpecName: "kube-api-access-4bdmq") pod "c7f9fe71-06d0-4075-a262-74050f6b73d7" (UID: "c7f9fe71-06d0-4075-a262-74050f6b73d7"). InnerVolumeSpecName "kube-api-access-4bdmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.967628 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k" (OuterVolumeSpecName: "kube-api-access-phr2k") pod "14f8a950-a259-4279-99e9-d33a4fc93e7d" (UID: "14f8a950-a259-4279-99e9-d33a4fc93e7d"). InnerVolumeSpecName "kube-api-access-phr2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:45 crc kubenswrapper[4693]: I1204 10:03:45.968176 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x" (OuterVolumeSpecName: "kube-api-access-zh44x") pod "9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" (UID: "9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75"). InnerVolumeSpecName "kube-api-access-zh44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.026824 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.033247 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h22jp" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061055 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27c2\" (UniqueName: \"kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2\") pod \"561053fe-0024-4f25-bfab-94a8e139ac06\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061164 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts\") pod \"561053fe-0024-4f25-bfab-94a8e139ac06\" (UID: \"561053fe-0024-4f25-bfab-94a8e139ac06\") " Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061207 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts\") pod \"df7aea09-5790-4fda-9a37-8ada0326c2d0\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061272 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls4d6\" (UniqueName: \"kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6\") pod \"df7aea09-5790-4fda-9a37-8ada0326c2d0\" (UID: \"df7aea09-5790-4fda-9a37-8ada0326c2d0\") " Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061766 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f8a950-a259-4279-99e9-d33a4fc93e7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 
10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061796 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e98765-1b21-4b1a-80be-e5dc19d13082-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061812 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb7xs\" (UniqueName: \"kubernetes.io/projected/b1e98765-1b21-4b1a-80be-e5dc19d13082-kube-api-access-gb7xs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061826 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061838 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bdmq\" (UniqueName: \"kubernetes.io/projected/c7f9fe71-06d0-4075-a262-74050f6b73d7-kube-api-access-4bdmq\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061851 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phr2k\" (UniqueName: \"kubernetes.io/projected/14f8a950-a259-4279-99e9-d33a4fc93e7d-kube-api-access-phr2k\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061863 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh44x\" (UniqueName: \"kubernetes.io/projected/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75-kube-api-access-zh44x\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.061875 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7f9fe71-06d0-4075-a262-74050f6b73d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.062148 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df7aea09-5790-4fda-9a37-8ada0326c2d0" (UID: "df7aea09-5790-4fda-9a37-8ada0326c2d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.062257 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "561053fe-0024-4f25-bfab-94a8e139ac06" (UID: "561053fe-0024-4f25-bfab-94a8e139ac06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.064298 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2" (OuterVolumeSpecName: "kube-api-access-t27c2") pod "561053fe-0024-4f25-bfab-94a8e139ac06" (UID: "561053fe-0024-4f25-bfab-94a8e139ac06"). InnerVolumeSpecName "kube-api-access-t27c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.064940 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6" (OuterVolumeSpecName: "kube-api-access-ls4d6") pod "df7aea09-5790-4fda-9a37-8ada0326c2d0" (UID: "df7aea09-5790-4fda-9a37-8ada0326c2d0"). InnerVolumeSpecName "kube-api-access-ls4d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.099296 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6716-account-create-update-v4sm8" event={"ID":"b1e98765-1b21-4b1a-80be-e5dc19d13082","Type":"ContainerDied","Data":"75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.099383 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75368d09ab5290e72eb557d2271bbea05ef7cf9ab50b4a913ddf6c8d8a2dc3c5" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.099968 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6716-account-create-update-v4sm8" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.102752 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gb6kd" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.102754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gb6kd" event={"ID":"9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75","Type":"ContainerDied","Data":"239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.102870 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239529ea14035eb26045c2bf26d35e9b8417ad5828a1541ba9fbd741b2d38175" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.109276 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5pq" event={"ID":"df7aea09-5790-4fda-9a37-8ada0326c2d0","Type":"ContainerDied","Data":"a7db98fef6b424e037733a52110d9796283992511596cfa90371cb52cf77b9fa"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.109305 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ck5pq" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.109320 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7db98fef6b424e037733a52110d9796283992511596cfa90371cb52cf77b9fa" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.111522 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ba22-account-create-update-vzw95" event={"ID":"14f8a950-a259-4279-99e9-d33a4fc93e7d","Type":"ContainerDied","Data":"7dc66b5c86dd5ae68e1292e7266361cf2b2ea8a6d42ef5cc6f95d54d1ba21d03"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.111548 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc66b5c86dd5ae68e1292e7266361cf2b2ea8a6d42ef5cc6f95d54d1ba21d03" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.111648 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ba22-account-create-update-vzw95" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.114125 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-84c8-account-create-update-wmv28" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.114056 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-84c8-account-create-update-wmv28" event={"ID":"c7f9fe71-06d0-4075-a262-74050f6b73d7","Type":"ContainerDied","Data":"6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.115077 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6122dde38e99baf7021d7896db5bdf119404aeab361491fd870088f75166bc" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.116108 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-h22jp" event={"ID":"561053fe-0024-4f25-bfab-94a8e139ac06","Type":"ContainerDied","Data":"f4ad3dd47ebf20ce30af94306af7e5a1c5d2d6fbbddb1fe8e7cfea4d40bf0051"} Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.116169 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-h22jp" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.116549 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4ad3dd47ebf20ce30af94306af7e5a1c5d2d6fbbddb1fe8e7cfea4d40bf0051" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.163841 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27c2\" (UniqueName: \"kubernetes.io/projected/561053fe-0024-4f25-bfab-94a8e139ac06-kube-api-access-t27c2\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.163905 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561053fe-0024-4f25-bfab-94a8e139ac06-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.163920 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df7aea09-5790-4fda-9a37-8ada0326c2d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:46 crc kubenswrapper[4693]: I1204 10:03:46.163936 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls4d6\" (UniqueName: \"kubernetes.io/projected/df7aea09-5790-4fda-9a37-8ada0326c2d0-kube-api-access-ls4d6\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.124784 4693 generic.go:334] "Generic (PLEG): container finished" podID="39170f53-93c9-49fd-8dba-42d325269e74" containerID="1c9ebccfb99136aceae2e54e6fce75a38fc5e34d50a59200f02f59900eee5f2e" exitCode=0 Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.124869 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xz7js" event={"ID":"39170f53-93c9-49fd-8dba-42d325269e74","Type":"ContainerDied","Data":"1c9ebccfb99136aceae2e54e6fce75a38fc5e34d50a59200f02f59900eee5f2e"} Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665270 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9q6d6"] Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665603 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f9fe71-06d0-4075-a262-74050f6b73d7" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665616 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f9fe71-06d0-4075-a262-74050f6b73d7" 
containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665633 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e98765-1b21-4b1a-80be-e5dc19d13082" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665640 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e98765-1b21-4b1a-80be-e5dc19d13082" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665660 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561053fe-0024-4f25-bfab-94a8e139ac06" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665666 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="561053fe-0024-4f25-bfab-94a8e139ac06" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665675 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665682 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665691 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7aea09-5790-4fda-9a37-8ada0326c2d0" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665699 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7aea09-5790-4fda-9a37-8ada0326c2d0" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: E1204 10:03:47.665726 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f8a950-a259-4279-99e9-d33a4fc93e7d" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665734 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f8a950-a259-4279-99e9-d33a4fc93e7d" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665933 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f9fe71-06d0-4075-a262-74050f6b73d7" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665952 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665962 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="561053fe-0024-4f25-bfab-94a8e139ac06" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665975 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e98765-1b21-4b1a-80be-e5dc19d13082" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665988 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7aea09-5790-4fda-9a37-8ada0326c2d0" containerName="mariadb-database-create" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.665995 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f8a950-a259-4279-99e9-d33a4fc93e7d" containerName="mariadb-account-create-update" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.666746 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.668739 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mdxbt" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.669232 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.682127 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9q6d6"] Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.685120 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.685349 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.685547 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.685684 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8d7\" (UniqueName: \"kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.787943 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.788043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8d7\" (UniqueName: \"kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.788185 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.788261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data\") pod 
\"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.793527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.796311 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.799252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.805448 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8d7\" (UniqueName: \"kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7\") pod \"glance-db-sync-9q6d6\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:47 crc kubenswrapper[4693]: I1204 10:03:47.983670 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9q6d6" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.407196 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603178 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhfj\" (UniqueName: \"kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603397 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603468 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603496 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603538 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.603615 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf\") pod \"39170f53-93c9-49fd-8dba-42d325269e74\" (UID: \"39170f53-93c9-49fd-8dba-42d325269e74\") " Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.604133 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.606590 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.609570 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj" (OuterVolumeSpecName: "kube-api-access-vxhfj") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "kube-api-access-vxhfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.612022 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.629013 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.631420 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.632573 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts" (OuterVolumeSpecName: "scripts") pod "39170f53-93c9-49fd-8dba-42d325269e74" (UID: "39170f53-93c9-49fd-8dba-42d325269e74"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.700530 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.704975 4693 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705001 4693 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39170f53-93c9-49fd-8dba-42d325269e74-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705012 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39170f53-93c9-49fd-8dba-42d325269e74-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705026 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705038 4693 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705051 4693 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39170f53-93c9-49fd-8dba-42d325269e74-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:48 crc kubenswrapper[4693]: I1204 10:03:48.705062 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhfj\" (UniqueName: \"kubernetes.io/projected/39170f53-93c9-49fd-8dba-42d325269e74-kube-api-access-vxhfj\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.144104 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xz7js" event={"ID":"39170f53-93c9-49fd-8dba-42d325269e74","Type":"ContainerDied","Data":"d040bc4e150b780285f1072d99c87a097173b3261a57b26eeafbee0d127ef09a"} Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.144323 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d040bc4e150b780285f1072d99c87a097173b3261a57b26eeafbee0d127ef09a" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.144185 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xz7js" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.165543 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gxwt6"] Dec 04 10:03:49 crc kubenswrapper[4693]: E1204 10:03:49.165967 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39170f53-93c9-49fd-8dba-42d325269e74" containerName="swift-ring-rebalance" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.165984 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="39170f53-93c9-49fd-8dba-42d325269e74" containerName="swift-ring-rebalance" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.166241 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="39170f53-93c9-49fd-8dba-42d325269e74" containerName="swift-ring-rebalance" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.166852 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.181949 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gxwt6"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.189759 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9q6d6"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.312726 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.313999 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.314038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rndv\" (UniqueName: \"kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.316277 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-rc9st"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.317621 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.323664 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4211-account-create-update-r4vb9"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.324692 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.331497 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.338367 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-rc9st"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.347752 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4211-account-create-update-r4vb9"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.415075 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.415256 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.415275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2fdl\" (UniqueName: \"kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.415296 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rndv\" (UniqueName: \"kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.416679 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.440138 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rndv\" (UniqueName: \"kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv\") pod \"cinder-db-create-gxwt6\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.440664 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ef17-account-create-update-wcgbx"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.442156 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.449798 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ef17-account-create-update-wcgbx"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.453061 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.519834 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvbn\" (UniqueName: \"kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.519889 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.519962 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.520008 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2fdl\" (UniqueName: \"kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.525558 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxwt6" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.526323 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.531658 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sk4bl"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.532972 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.553609 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2fdl\" (UniqueName: \"kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl\") pod \"manila-db-create-rc9st\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.574407 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sk4bl"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621537 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621757 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r85w\" (UniqueName: \"kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621862 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621886 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.621956 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnjj\" (UniqueName: \"kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.622056 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvbn\" (UniqueName: \"kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.622523 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.624657 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.628964 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.634288 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rc9st" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.636601 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.647759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvbn\" (UniqueName: \"kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn\") pod \"manila-4211-account-create-update-r4vb9\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.722208 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0576-account-create-update-gqnh2"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.723361 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725388 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725522 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnjj\" (UniqueName: \"kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725654 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqq8p\" (UniqueName: \"kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725704 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r85w\" (UniqueName: \"kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.725730 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.726386 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.726470 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.727129 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.735661 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0576-account-create-update-gqnh2"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.746003 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r85w\" (UniqueName: \"kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w\") pod \"barbican-db-create-sk4bl\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.755946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnjj\" (UniqueName: \"kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj\") pod \"cinder-ef17-account-create-update-wcgbx\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.799363 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.820864 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-j5pgf"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.821832 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.827632 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqq8p\" (UniqueName: \"kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.827684 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.827717 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.827769 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnq4n\" (UniqueName: \"kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.828751 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.834622 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j5pgf"] Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.852578 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqq8p\" (UniqueName: \"kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p\") pod \"neutron-ae08-account-create-update-xbp2n\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.894678 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sk4bl" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.929057 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7f29\" (UniqueName: \"kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.929140 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnq4n\" (UniqueName: \"kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.929205 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.929298 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.930008 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.942479 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.943732 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:03:49 crc kubenswrapper[4693]: I1204 10:03:49.948267 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnq4n\" (UniqueName: \"kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n\") pod \"barbican-0576-account-create-update-gqnh2\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.030567 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7f29\" (UniqueName: \"kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.030998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.031708 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.041342 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.050955 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7f29\" (UniqueName: \"kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29\") pod \"neutron-db-create-j5pgf\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.157602 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9q6d6" event={"ID":"6343580b-81bd-4993-a298-3b31730e6ae3","Type":"ContainerStarted","Data":"b39baba43558b8cc997c029362a65f3366d1bb5f87bac260311832c77619c2bd"} Dec 04 10:03:50 crc kubenswrapper[4693]: I1204 10:03:50.235788 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j5pgf" Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.329214 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.644234 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gxwt6"] Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.727813 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0576-account-create-update-gqnh2"] Dec 04 10:03:51 crc kubenswrapper[4693]: W1204 10:03:51.744448 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ba2460c_1d0a_4b7b_8159_5c94778aab54.slice/crio-c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6 WatchSource:0}: Error finding container c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6: Status 404 returned error can't find the container with id c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6 Dec 04 10:03:51 crc kubenswrapper[4693]: W1204 10:03:51.748589 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c48aae9_77d8_4b25_989c_7b51c5938929.slice/crio-6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c WatchSource:0}: Error finding container 6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c: Status 404 returned error can't find the container with id 6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.751920 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ef17-account-create-update-wcgbx"] Dec 04 10:03:51 crc kubenswrapper[4693]: W1204 10:03:51.758052 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cafe793_8113_4ef4_ada0_5e699a68ea59.slice/crio-66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3 WatchSource:0}: Error finding container 66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3: Status 404 returned error can't find the container with id 66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3 Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.759451 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j5pgf"] Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.840300 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"] Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.852632 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-rc9st"] Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.871985 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sk4bl"] Dec 04 10:03:51 crc kubenswrapper[4693]: I1204 10:03:51.889612 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4211-account-create-update-r4vb9"] Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.195889 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef17-account-create-update-wcgbx" event={"ID":"4ba2460c-1d0a-4b7b-8159-5c94778aab54","Type":"ContainerStarted","Data":"c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6"} Dec 04 10:03:52 crc 
kubenswrapper[4693]: I1204 10:03:52.198170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxwt6" event={"ID":"d93060d5-ff5d-4f46-991d-b9b40c5d280d","Type":"ContainerStarted","Data":"2fc781017589709b37face0663bc5eee93bd18cc938013abfe23baba3ff4ceb9"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.199645 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae08-account-create-update-xbp2n" event={"ID":"55360827-2b39-4864-8d98-deb8d8fca9cc","Type":"ContainerStarted","Data":"5feed33106b110d229c9c14edcd016b4ca8955082f774e091c80d9068904fe6e"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.200555 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rc9st" event={"ID":"6436522d-be9a-4cca-9928-8a001d22836e","Type":"ContainerStarted","Data":"3a9471d0d8da468941af37506de58995cfef2164cbcd2cb980bb6fe36c318670"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.202847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sk4bl" event={"ID":"7ed103ee-266a-4848-a410-2e01ea8f694a","Type":"ContainerStarted","Data":"bf415c7e35a7dc109bfa53e22cfc1e710384ba3004b4e8cdd6d9ae70722c387b"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.203733 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0576-account-create-update-gqnh2" event={"ID":"1c48aae9-77d8-4b25-989c-7b51c5938929","Type":"ContainerStarted","Data":"6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.204407 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4211-account-create-update-r4vb9" event={"ID":"a392a0d0-239b-4be6-896c-3da1a401c361","Type":"ContainerStarted","Data":"eb2c8b1b658a301d84b8521ccb7716649ec41c9eaac915e1ac8e7d7339c18b69"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.205018 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j5pgf" event={"ID":"7cafe793-8113-4ef4-ada0-5e699a68ea59","Type":"ContainerStarted","Data":"66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3"} Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.274123 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.274174 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.274221 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.275051 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.275114 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf" gracePeriod=600 Dec 04 10:03:52 crc kubenswrapper[4693]: E1204 10:03:52.449422 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f65408_7d18_47db_8a19_f9be435dd348.slice/crio-17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.524442 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4zr9z"] Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.527021 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4zr9z"] Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.527113 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.529918 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.530654 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.531203 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9fq9s" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.531347 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.696110 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.696279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.696508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4qn\" (UniqueName: \"kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.798782 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " 
pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.798847 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.798881 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4qn\" (UniqueName: \"kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.804787 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.805074 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.818951 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4qn\" (UniqueName: \"kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn\") pod \"keystone-db-sync-4zr9z\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:52 crc kubenswrapper[4693]: I1204 10:03:52.850002 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.229852 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sk4bl" event={"ID":"7ed103ee-266a-4848-a410-2e01ea8f694a","Type":"ContainerStarted","Data":"305c7316da231b936393e56f4c94571114d9dc064bbf3f8ce2d05010597a5964"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.232747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4211-account-create-update-r4vb9" event={"ID":"a392a0d0-239b-4be6-896c-3da1a401c361","Type":"ContainerStarted","Data":"8b75043cd3fb50f0f4d747dd8ca8914186d73fadd142fd55ddb872745a3513e8"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.236735 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j5pgf" event={"ID":"7cafe793-8113-4ef4-ada0-5e699a68ea59","Type":"ContainerStarted","Data":"728e17ef8b0681df2395de98aa589cd1ee8e766b880f4843de777c6c7352820f"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.254152 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef17-account-create-update-wcgbx" event={"ID":"4ba2460c-1d0a-4b7b-8159-5c94778aab54","Type":"ContainerStarted","Data":"0f8c449a5b8b7ad671abb92b5de3898b5c1062812568636c7b6e76bcce45212c"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.265904 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rc9st" event={"ID":"6436522d-be9a-4cca-9928-8a001d22836e","Type":"ContainerStarted","Data":"7ee28d74d1c1c14f248f78b43ea1615867ee2d955debb12ff85f1b44167e2b37"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.265942 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-j5pgf" podStartSLOduration=4.265928993 podStartE2EDuration="4.265928993s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.258878729 +0000 UTC m=+1279.156472482" watchObservedRunningTime="2025-12-04 10:03:53.265928993 +0000 UTC m=+1279.163522746" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.287548 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf" exitCode=0 Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.287694 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.293410 4693 scope.go:117] "RemoveContainer" containerID="fa8175e0c93e033e33e0a667d8b04220a3901467b64921658c170a933445969a" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.294569 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ef17-account-create-update-wcgbx" podStartSLOduration=4.294557394 podStartE2EDuration="4.294557394s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.278732597 +0000 UTC m=+1279.176326350" watchObservedRunningTime="2025-12-04 10:03:53.294557394 +0000 UTC 
m=+1279.192151147" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.319746 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0576-account-create-update-gqnh2" event={"ID":"1c48aae9-77d8-4b25-989c-7b51c5938929","Type":"ContainerStarted","Data":"662439998a3072442ca04691f337bcd55a5184dab1b0b487322fbf785af52ec7"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.320379 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-rc9st" podStartSLOduration=4.320356616 podStartE2EDuration="4.320356616s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.308353405 +0000 UTC m=+1279.205947168" watchObservedRunningTime="2025-12-04 10:03:53.320356616 +0000 UTC m=+1279.217950369" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.342216 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4zr9z"] Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.343139 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0576-account-create-update-gqnh2" podStartSLOduration=4.343129324 podStartE2EDuration="4.343129324s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.342687553 +0000 UTC m=+1279.240281306" watchObservedRunningTime="2025-12-04 10:03:53.343129324 +0000 UTC m=+1279.240723077" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.350172 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxwt6" event={"ID":"d93060d5-ff5d-4f46-991d-b9b40c5d280d","Type":"ContainerStarted","Data":"b85d1a787743ad35f3711e020be32203b24a8d6da34cb07a14f0877594cbd4e4"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.360672 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae08-account-create-update-xbp2n" event={"ID":"55360827-2b39-4864-8d98-deb8d8fca9cc","Type":"ContainerStarted","Data":"c80f2f65e9a2b955f175661de526c8dc4736042f77434b1c2ccd6eea7df00287"} Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.372059 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-gxwt6" podStartSLOduration=4.372041883 podStartE2EDuration="4.372041883s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.367324543 +0000 UTC m=+1279.264918306" watchObservedRunningTime="2025-12-04 10:03:53.372041883 +0000 UTC m=+1279.269635636" Dec 04 10:03:53 crc kubenswrapper[4693]: I1204 10:03:53.390887 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ae08-account-create-update-xbp2n" podStartSLOduration=4.390817371 podStartE2EDuration="4.390817371s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:53.383279383 +0000 UTC m=+1279.280873136" watchObservedRunningTime="2025-12-04 10:03:53.390817371 +0000 UTC m=+1279.288411134" Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.370887 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="7cafe793-8113-4ef4-ada0-5e699a68ea59" containerID="728e17ef8b0681df2395de98aa589cd1ee8e766b880f4843de777c6c7352820f" exitCode=0 Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.370929 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j5pgf" event={"ID":"7cafe793-8113-4ef4-ada0-5e699a68ea59","Type":"ContainerDied","Data":"728e17ef8b0681df2395de98aa589cd1ee8e766b880f4843de777c6c7352820f"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.375536 4693 generic.go:334] "Generic (PLEG): container finished" podID="d93060d5-ff5d-4f46-991d-b9b40c5d280d" containerID="b85d1a787743ad35f3711e020be32203b24a8d6da34cb07a14f0877594cbd4e4" exitCode=0 Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.375607 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxwt6" event={"ID":"d93060d5-ff5d-4f46-991d-b9b40c5d280d","Type":"ContainerDied","Data":"b85d1a787743ad35f3711e020be32203b24a8d6da34cb07a14f0877594cbd4e4"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.378962 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zr9z" event={"ID":"4e565feb-3867-434a-9b7e-9cae0a2f9152","Type":"ContainerStarted","Data":"938237148dc655395c953c070bce8359a52d62610c18da16db76142037d9d7da"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.380495 4693 generic.go:334] "Generic (PLEG): container finished" podID="6436522d-be9a-4cca-9928-8a001d22836e" containerID="7ee28d74d1c1c14f248f78b43ea1615867ee2d955debb12ff85f1b44167e2b37" exitCode=0 Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.380610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rc9st" event={"ID":"6436522d-be9a-4cca-9928-8a001d22836e","Type":"ContainerDied","Data":"7ee28d74d1c1c14f248f78b43ea1615867ee2d955debb12ff85f1b44167e2b37"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.389078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.391942 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ed103ee-266a-4848-a410-2e01ea8f694a" containerID="305c7316da231b936393e56f4c94571114d9dc064bbf3f8ce2d05010597a5964" exitCode=0 Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.392513 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sk4bl" event={"ID":"7ed103ee-266a-4848-a410-2e01ea8f694a","Type":"ContainerDied","Data":"305c7316da231b936393e56f4c94571114d9dc064bbf3f8ce2d05010597a5964"} Dec 04 10:03:54 crc kubenswrapper[4693]: I1204 10:03:54.458839 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-4211-account-create-update-r4vb9" podStartSLOduration=5.458821255 podStartE2EDuration="5.458821255s" podCreationTimestamp="2025-12-04 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:54.449292062 +0000 UTC m=+1280.346885815" watchObservedRunningTime="2025-12-04 10:03:54.458821255 +0000 UTC m=+1280.356415008" Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.404118 4693 generic.go:334] "Generic (PLEG): container finished" podID="1c48aae9-77d8-4b25-989c-7b51c5938929" 
containerID="662439998a3072442ca04691f337bcd55a5184dab1b0b487322fbf785af52ec7" exitCode=0 Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.404206 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0576-account-create-update-gqnh2" event={"ID":"1c48aae9-77d8-4b25-989c-7b51c5938929","Type":"ContainerDied","Data":"662439998a3072442ca04691f337bcd55a5184dab1b0b487322fbf785af52ec7"} Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.406850 4693 generic.go:334] "Generic (PLEG): container finished" podID="a392a0d0-239b-4be6-896c-3da1a401c361" containerID="8b75043cd3fb50f0f4d747dd8ca8914186d73fadd142fd55ddb872745a3513e8" exitCode=0 Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.406894 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4211-account-create-update-r4vb9" event={"ID":"a392a0d0-239b-4be6-896c-3da1a401c361","Type":"ContainerDied","Data":"8b75043cd3fb50f0f4d747dd8ca8914186d73fadd142fd55ddb872745a3513e8"} Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.409377 4693 generic.go:334] "Generic (PLEG): container finished" podID="4ba2460c-1d0a-4b7b-8159-5c94778aab54" containerID="0f8c449a5b8b7ad671abb92b5de3898b5c1062812568636c7b6e76bcce45212c" exitCode=0 Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.409457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef17-account-create-update-wcgbx" event={"ID":"4ba2460c-1d0a-4b7b-8159-5c94778aab54","Type":"ContainerDied","Data":"0f8c449a5b8b7ad671abb92b5de3898b5c1062812568636c7b6e76bcce45212c"} Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.412611 4693 generic.go:334] "Generic (PLEG): container finished" podID="55360827-2b39-4864-8d98-deb8d8fca9cc" containerID="c80f2f65e9a2b955f175661de526c8dc4736042f77434b1c2ccd6eea7df00287" exitCode=0 Dec 04 10:03:55 crc kubenswrapper[4693]: I1204 10:03:55.412653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae08-account-create-update-xbp2n" event={"ID":"55360827-2b39-4864-8d98-deb8d8fca9cc","Type":"ContainerDied","Data":"c80f2f65e9a2b955f175661de526c8dc4736042f77434b1c2ccd6eea7df00287"} Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.817296 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sk4bl" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.822590 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.842867 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.858775 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-rc9st" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.867595 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.884026 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.891251 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j5pgf" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972609 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts\") pod \"1c48aae9-77d8-4b25-989c-7b51c5938929\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972680 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts\") pod \"55360827-2b39-4864-8d98-deb8d8fca9cc\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972726 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnq4n\" (UniqueName: \"kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n\") pod \"1c48aae9-77d8-4b25-989c-7b51c5938929\" (UID: \"1c48aae9-77d8-4b25-989c-7b51c5938929\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972760 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbvbn\" (UniqueName: \"kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn\") pod \"a392a0d0-239b-4be6-896c-3da1a401c361\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972781 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqq8p\" (UniqueName: \"kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p\") pod \"55360827-2b39-4864-8d98-deb8d8fca9cc\" (UID: \"55360827-2b39-4864-8d98-deb8d8fca9cc\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972857 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts\") pod \"a392a0d0-239b-4be6-896c-3da1a401c361\" (UID: \"a392a0d0-239b-4be6-896c-3da1a401c361\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.972886 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts\") pod \"7ed103ee-266a-4848-a410-2e01ea8f694a\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973053 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts\") pod \"6436522d-be9a-4cca-9928-8a001d22836e\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973112 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r85w\" (UniqueName: \"kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w\") pod \"7ed103ee-266a-4848-a410-2e01ea8f694a\" (UID: \"7ed103ee-266a-4848-a410-2e01ea8f694a\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973137 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2fdl\" (UniqueName: 
\"kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl\") pod \"6436522d-be9a-4cca-9928-8a001d22836e\" (UID: \"6436522d-be9a-4cca-9928-8a001d22836e\") " Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973484 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c48aae9-77d8-4b25-989c-7b51c5938929" (UID: "1c48aae9-77d8-4b25-989c-7b51c5938929"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973497 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55360827-2b39-4864-8d98-deb8d8fca9cc" (UID: "55360827-2b39-4864-8d98-deb8d8fca9cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.973992 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6436522d-be9a-4cca-9928-8a001d22836e" (UID: "6436522d-be9a-4cca-9928-8a001d22836e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.974479 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ed103ee-266a-4848-a410-2e01ea8f694a" (UID: "7ed103ee-266a-4848-a410-2e01ea8f694a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.974462 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a392a0d0-239b-4be6-896c-3da1a401c361" (UID: "a392a0d0-239b-4be6-896c-3da1a401c361"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.980040 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w" (OuterVolumeSpecName: "kube-api-access-9r85w") pod "7ed103ee-266a-4848-a410-2e01ea8f694a" (UID: "7ed103ee-266a-4848-a410-2e01ea8f694a"). InnerVolumeSpecName "kube-api-access-9r85w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.980093 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn" (OuterVolumeSpecName: "kube-api-access-dbvbn") pod "a392a0d0-239b-4be6-896c-3da1a401c361" (UID: "a392a0d0-239b-4be6-896c-3da1a401c361"). InnerVolumeSpecName "kube-api-access-dbvbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.980207 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl" (OuterVolumeSpecName: "kube-api-access-x2fdl") pod "6436522d-be9a-4cca-9928-8a001d22836e" (UID: "6436522d-be9a-4cca-9928-8a001d22836e"). InnerVolumeSpecName "kube-api-access-x2fdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.981753 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p" (OuterVolumeSpecName: "kube-api-access-kqq8p") pod "55360827-2b39-4864-8d98-deb8d8fca9cc" (UID: "55360827-2b39-4864-8d98-deb8d8fca9cc"). InnerVolumeSpecName "kube-api-access-kqq8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:00 crc kubenswrapper[4693]: I1204 10:04:00.984515 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n" (OuterVolumeSpecName: "kube-api-access-gnq4n") pod "1c48aae9-77d8-4b25-989c-7b51c5938929" (UID: "1c48aae9-77d8-4b25-989c-7b51c5938929"). InnerVolumeSpecName "kube-api-access-gnq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.075868 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvnjj\" (UniqueName: \"kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj\") pod \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076168 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts\") pod \"7cafe793-8113-4ef4-ada0-5e699a68ea59\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076242 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7f29\" (UniqueName: \"kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29\") pod \"7cafe793-8113-4ef4-ada0-5e699a68ea59\" (UID: \"7cafe793-8113-4ef4-ada0-5e699a68ea59\") " Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076310 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts\") pod \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\" (UID: \"4ba2460c-1d0a-4b7b-8159-5c94778aab54\") " Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076814 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r85w\" (UniqueName: \"kubernetes.io/projected/7ed103ee-266a-4848-a410-2e01ea8f694a-kube-api-access-9r85w\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076838 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2fdl\" (UniqueName: \"kubernetes.io/projected/6436522d-be9a-4cca-9928-8a001d22836e-kube-api-access-x2fdl\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076852 4693 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c48aae9-77d8-4b25-989c-7b51c5938929-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076868 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55360827-2b39-4864-8d98-deb8d8fca9cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076884 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnq4n\" (UniqueName: \"kubernetes.io/projected/1c48aae9-77d8-4b25-989c-7b51c5938929-kube-api-access-gnq4n\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076874 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cafe793-8113-4ef4-ada0-5e699a68ea59" (UID: "7cafe793-8113-4ef4-ada0-5e699a68ea59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076896 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbvbn\" (UniqueName: \"kubernetes.io/projected/a392a0d0-239b-4be6-896c-3da1a401c361-kube-api-access-dbvbn\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076944 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqq8p\" (UniqueName: \"kubernetes.io/projected/55360827-2b39-4864-8d98-deb8d8fca9cc-kube-api-access-kqq8p\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076960 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a392a0d0-239b-4be6-896c-3da1a401c361-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076973 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed103ee-266a-4848-a410-2e01ea8f694a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.076986 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436522d-be9a-4cca-9928-8a001d22836e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.077037 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ba2460c-1d0a-4b7b-8159-5c94778aab54" (UID: "4ba2460c-1d0a-4b7b-8159-5c94778aab54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.079509 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29" (OuterVolumeSpecName: "kube-api-access-z7f29") pod "7cafe793-8113-4ef4-ada0-5e699a68ea59" (UID: "7cafe793-8113-4ef4-ada0-5e699a68ea59"). InnerVolumeSpecName "kube-api-access-z7f29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.079617 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj" (OuterVolumeSpecName: "kube-api-access-tvnjj") pod "4ba2460c-1d0a-4b7b-8159-5c94778aab54" (UID: "4ba2460c-1d0a-4b7b-8159-5c94778aab54"). InnerVolumeSpecName "kube-api-access-tvnjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.178399 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvnjj\" (UniqueName: \"kubernetes.io/projected/4ba2460c-1d0a-4b7b-8159-5c94778aab54-kube-api-access-tvnjj\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.178431 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cafe793-8113-4ef4-ada0-5e699a68ea59-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.178442 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7f29\" (UniqueName: \"kubernetes.io/projected/7cafe793-8113-4ef4-ada0-5e699a68ea59-kube-api-access-z7f29\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.178452 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ba2460c-1d0a-4b7b-8159-5c94778aab54-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.465490 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ae08-account-create-update-xbp2n" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.465533 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae08-account-create-update-xbp2n" event={"ID":"55360827-2b39-4864-8d98-deb8d8fca9cc","Type":"ContainerDied","Data":"5feed33106b110d229c9c14edcd016b4ca8955082f774e091c80d9068904fe6e"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.465568 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5feed33106b110d229c9c14edcd016b4ca8955082f774e091c80d9068904fe6e" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.467030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-rc9st" event={"ID":"6436522d-be9a-4cca-9928-8a001d22836e","Type":"ContainerDied","Data":"3a9471d0d8da468941af37506de58995cfef2164cbcd2cb980bb6fe36c318670"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.467045 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-rc9st" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.467059 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a9471d0d8da468941af37506de58995cfef2164cbcd2cb980bb6fe36c318670" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.469019 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sk4bl" event={"ID":"7ed103ee-266a-4848-a410-2e01ea8f694a","Type":"ContainerDied","Data":"bf415c7e35a7dc109bfa53e22cfc1e710384ba3004b4e8cdd6d9ae70722c387b"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.469064 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf415c7e35a7dc109bfa53e22cfc1e710384ba3004b4e8cdd6d9ae70722c387b" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.469042 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sk4bl" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.470374 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0576-account-create-update-gqnh2" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.470408 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0576-account-create-update-gqnh2" event={"ID":"1c48aae9-77d8-4b25-989c-7b51c5938929","Type":"ContainerDied","Data":"6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.470471 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5e37ac4331c05338720461f1fbd704c3f4255fdde63db94a4eac86b7d09b7c" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.471853 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4211-account-create-update-r4vb9" event={"ID":"a392a0d0-239b-4be6-896c-3da1a401c361","Type":"ContainerDied","Data":"eb2c8b1b658a301d84b8521ccb7716649ec41c9eaac915e1ac8e7d7339c18b69"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.471889 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb2c8b1b658a301d84b8521ccb7716649ec41c9eaac915e1ac8e7d7339c18b69" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.471856 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4211-account-create-update-r4vb9" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.473292 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j5pgf" event={"ID":"7cafe793-8113-4ef4-ada0-5e699a68ea59","Type":"ContainerDied","Data":"66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.473356 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66dfc5a204484f13d4ea273fb6741611aa1b5604fd1628eec97e9dc1eb637dc3" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.473380 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j5pgf" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.474472 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef17-account-create-update-wcgbx" event={"ID":"4ba2460c-1d0a-4b7b-8159-5c94778aab54","Type":"ContainerDied","Data":"c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6"} Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.474583 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92b03865ae5ed73d9d6d04d27990161c0e4dc711e9b24f9f65c90570690c3a6" Dec 04 10:04:01 crc kubenswrapper[4693]: I1204 10:04:01.474549 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef17-account-create-update-wcgbx" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.526407 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.536255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73554998-24a4-4d23-a78d-66d51cbe24af-etc-swift\") pod \"swift-storage-0\" (UID: \"73554998-24a4-4d23-a78d-66d51cbe24af\") " pod="openstack/swift-storage-0" Dec 04 10:04:09 crc kubenswrapper[4693]: E1204 10:04:09.538547 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 04 10:04:09 crc kubenswrapper[4693]: E1204 10:04:09.538754 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dr8d7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-9q6d6_openstack(6343580b-81bd-4993-a298-3b31730e6ae3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:04:09 crc kubenswrapper[4693]: E1204 10:04:09.540099 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-9q6d6" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.542660 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.567304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gxwt6" event={"ID":"d93060d5-ff5d-4f46-991d-b9b40c5d280d","Type":"ContainerDied","Data":"2fc781017589709b37face0663bc5eee93bd18cc938013abfe23baba3ff4ceb9"} Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.567359 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc781017589709b37face0663bc5eee93bd18cc938013abfe23baba3ff4ceb9" Dec 04 10:04:09 crc kubenswrapper[4693]: E1204 10:04:09.567897 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-9q6d6" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.694533 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxwt6" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.830842 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts\") pod \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.831222 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rndv\" (UniqueName: \"kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv\") pod \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\" (UID: \"d93060d5-ff5d-4f46-991d-b9b40c5d280d\") " Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.832766 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d93060d5-ff5d-4f46-991d-b9b40c5d280d" (UID: "d93060d5-ff5d-4f46-991d-b9b40c5d280d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.835830 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv" (OuterVolumeSpecName: "kube-api-access-6rndv") pod "d93060d5-ff5d-4f46-991d-b9b40c5d280d" (UID: "d93060d5-ff5d-4f46-991d-b9b40c5d280d"). InnerVolumeSpecName "kube-api-access-6rndv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.932325 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rndv\" (UniqueName: \"kubernetes.io/projected/d93060d5-ff5d-4f46-991d-b9b40c5d280d-kube-api-access-6rndv\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:09 crc kubenswrapper[4693]: I1204 10:04:09.932371 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d93060d5-ff5d-4f46-991d-b9b40c5d280d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:10 crc kubenswrapper[4693]: I1204 10:04:10.146698 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 10:04:10 crc kubenswrapper[4693]: I1204 10:04:10.574644 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zr9z" event={"ID":"4e565feb-3867-434a-9b7e-9cae0a2f9152","Type":"ContainerStarted","Data":"510ab1cfe0a6633c3ddde08be3570e8036917a89eebd96b3ae933a4e05c57267"} Dec 04 10:04:10 crc kubenswrapper[4693]: I1204 10:04:10.576361 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"4a85ad93046e31cb5602dcb6c89ac434d1e483f09a3493b342b479f84e7d113c"} Dec 04 10:04:10 crc kubenswrapper[4693]: I1204 10:04:10.576381 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gxwt6" Dec 04 10:04:10 crc kubenswrapper[4693]: I1204 10:04:10.592824 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4zr9z" podStartSLOduration=2.442459573 podStartE2EDuration="18.592804966s" podCreationTimestamp="2025-12-04 10:03:52 +0000 UTC" firstStartedPulling="2025-12-04 10:03:53.388043014 +0000 UTC m=+1279.285636767" lastFinishedPulling="2025-12-04 10:04:09.538388417 +0000 UTC m=+1295.435982160" observedRunningTime="2025-12-04 10:04:10.591013807 +0000 UTC m=+1296.488607560" watchObservedRunningTime="2025-12-04 10:04:10.592804966 +0000 UTC m=+1296.490398719" Dec 04 10:04:13 crc kubenswrapper[4693]: I1204 10:04:13.611932 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"2ca3d06d0bac341c631b955a3ec8507562a022717a395e6aa50427e5cdd7ed43"} Dec 04 10:04:14 crc kubenswrapper[4693]: I1204 10:04:14.621835 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"64389e872bb61e102c927e148563cf5a801dbfce7e9df839535e99e41bb780ac"} Dec 04 10:04:14 crc kubenswrapper[4693]: I1204 10:04:14.622193 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"147aa2ece59f658389393c1379c0e912abc1fb9db2050d21a126741b843bdfc5"} Dec 04 10:04:15 crc kubenswrapper[4693]: I1204 10:04:15.632886 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"7fed65a3e25696a5e7bc6844ecdac0559fd0437e7ba82148e7054f68955159fc"} Dec 04 10:04:17 crc kubenswrapper[4693]: I1204 10:04:17.656074 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"686316aaa0f1e9937d31d0343baf3dbca9469d27569bd5c9fd9d0ed33d8a8a03"} Dec 04 10:04:17 crc kubenswrapper[4693]: I1204 10:04:17.656124 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"5b4b3120e1c31f5bf45d5cc379b81d3c1d5f614a38d2f521c880b0cec0107ff0"} Dec 04 10:04:18 crc kubenswrapper[4693]: I1204 10:04:18.668573 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"1142774394b39d70695020e5185ba6b88af110fadd3d8d07d743649796f54318"} Dec 04 10:04:19 crc kubenswrapper[4693]: I1204 10:04:19.680836 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"d3cf519d6c23b589d2a5d7b207576d37ab526fdfe307af6dc3d0be46f6bbeaf5"} Dec 04 10:04:22 crc kubenswrapper[4693]: I1204 10:04:22.703807 4693 generic.go:334] "Generic (PLEG): container finished" podID="4e565feb-3867-434a-9b7e-9cae0a2f9152" containerID="510ab1cfe0a6633c3ddde08be3570e8036917a89eebd96b3ae933a4e05c57267" exitCode=0 Dec 04 10:04:22 crc kubenswrapper[4693]: I1204 10:04:22.704014 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zr9z" event={"ID":"4e565feb-3867-434a-9b7e-9cae0a2f9152","Type":"ContainerDied","Data":"510ab1cfe0a6633c3ddde08be3570e8036917a89eebd96b3ae933a4e05c57267"} Dec 04 10:04:22 crc kubenswrapper[4693]: I1204 10:04:22.728031 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"a3e2fbb2c2015598238bd8e15b2e26b16956198afcce4a85dca05175688973d8"} Dec 04 10:04:22 crc kubenswrapper[4693]: I1204 10:04:22.728070 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"3cefd429eb75c3525ae70363bbf313374c6148403f4cf8693d47f8d84a9707d1"} Dec 04 10:04:23 crc kubenswrapper[4693]: I1204 10:04:23.743591 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"6526ac88976a96b8dad9c7f5b33d21cbc3c46f78db351ce656eb32016282f5c4"} Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.067158 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.263596 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx4qn\" (UniqueName: \"kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn\") pod \"4e565feb-3867-434a-9b7e-9cae0a2f9152\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.264154 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data\") pod \"4e565feb-3867-434a-9b7e-9cae0a2f9152\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.264225 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle\") pod \"4e565feb-3867-434a-9b7e-9cae0a2f9152\" (UID: \"4e565feb-3867-434a-9b7e-9cae0a2f9152\") " Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.274258 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn" (OuterVolumeSpecName: "kube-api-access-qx4qn") pod "4e565feb-3867-434a-9b7e-9cae0a2f9152" (UID: "4e565feb-3867-434a-9b7e-9cae0a2f9152"). InnerVolumeSpecName "kube-api-access-qx4qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.293608 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e565feb-3867-434a-9b7e-9cae0a2f9152" (UID: "4e565feb-3867-434a-9b7e-9cae0a2f9152"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.323363 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data" (OuterVolumeSpecName: "config-data") pod "4e565feb-3867-434a-9b7e-9cae0a2f9152" (UID: "4e565feb-3867-434a-9b7e-9cae0a2f9152"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.367761 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx4qn\" (UniqueName: \"kubernetes.io/projected/4e565feb-3867-434a-9b7e-9cae0a2f9152-kube-api-access-qx4qn\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.367810 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.367823 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e565feb-3867-434a-9b7e-9cae0a2f9152-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.764938 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4zr9z" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.764939 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zr9z" event={"ID":"4e565feb-3867-434a-9b7e-9cae0a2f9152","Type":"ContainerDied","Data":"938237148dc655395c953c070bce8359a52d62610c18da16db76142037d9d7da"} Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.765003 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="938237148dc655395c953c070bce8359a52d62610c18da16db76142037d9d7da" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.773590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"47e48229295b660dfc273c74645cbf3e97b5e2e3ff662f6c92adfd48d356935d"} Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.773651 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"38b52aec86f7b51370bb4c9cba3ca2309e6acc655741580bcc763f6f16030c18"} Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.773668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"85b0c5b38ccf7005b1e95176175aafb65d632eb48ae4b8b49ebf075df26d4dc4"} Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.982688 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983295 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cafe793-8113-4ef4-ada0-5e699a68ea59" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983314 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cafe793-8113-4ef4-ada0-5e699a68ea59" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983349 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed103ee-266a-4848-a410-2e01ea8f694a" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983356 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed103ee-266a-4848-a410-2e01ea8f694a" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983365 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a392a0d0-239b-4be6-896c-3da1a401c361" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983372 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a392a0d0-239b-4be6-896c-3da1a401c361" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983384 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6436522d-be9a-4cca-9928-8a001d22836e" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983390 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6436522d-be9a-4cca-9928-8a001d22836e" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983400 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba2460c-1d0a-4b7b-8159-5c94778aab54" containerName="mariadb-account-create-update" Dec 04 10:04:24 
crc kubenswrapper[4693]: I1204 10:04:24.983406 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba2460c-1d0a-4b7b-8159-5c94778aab54" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983414 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c48aae9-77d8-4b25-989c-7b51c5938929" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983420 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c48aae9-77d8-4b25-989c-7b51c5938929" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983431 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e565feb-3867-434a-9b7e-9cae0a2f9152" containerName="keystone-db-sync" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983438 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e565feb-3867-434a-9b7e-9cae0a2f9152" containerName="keystone-db-sync" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983447 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93060d5-ff5d-4f46-991d-b9b40c5d280d" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983454 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93060d5-ff5d-4f46-991d-b9b40c5d280d" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: E1204 10:04:24.983466 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55360827-2b39-4864-8d98-deb8d8fca9cc" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983472 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="55360827-2b39-4864-8d98-deb8d8fca9cc" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983615 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cafe793-8113-4ef4-ada0-5e699a68ea59" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983635 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93060d5-ff5d-4f46-991d-b9b40c5d280d" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983649 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a392a0d0-239b-4be6-896c-3da1a401c361" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983659 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e565feb-3867-434a-9b7e-9cae0a2f9152" containerName="keystone-db-sync" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983669 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6436522d-be9a-4cca-9928-8a001d22836e" containerName="mariadb-database-create" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983681 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="55360827-2b39-4864-8d98-deb8d8fca9cc" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983693 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c48aae9-77d8-4b25-989c-7b51c5938929" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983707 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed103ee-266a-4848-a410-2e01ea8f694a" containerName="mariadb-database-create" Dec 
04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.983717 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba2460c-1d0a-4b7b-8159-5c94778aab54" containerName="mariadb-account-create-update" Dec 04 10:04:24 crc kubenswrapper[4693]: I1204 10:04:24.985103 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.020110 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.049407 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rq2zb"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.050515 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.060951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.061236 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.061421 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.061678 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.071311 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9fq9s" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.082812 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rq2zb"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.090620 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.091102 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzf6g\" (UniqueName: \"kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.091201 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.091230 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 
10:04:25.091281 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193861 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193881 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193923 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193949 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.193975 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.194041 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.194055 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.194075 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tzf6g\" (UniqueName: \"kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.194101 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7jj\" (UniqueName: \"kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.194121 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.195021 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.195635 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.196228 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.196511 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.206730 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fc99g"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.207993 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.212791 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.213144 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vbdj5" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.213363 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.220508 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.227559 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.231664 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.231930 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.232066 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-sgvjs" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.232213 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.232460 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fc99g"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.241372 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.261946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzf6g\" (UniqueName: \"kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g\") pod \"dnsmasq-dns-5c9d85d47c-wq9jf\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303465 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303529 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7jj\" (UniqueName: \"kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303557 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303593 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303743 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303773 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303804 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.303826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlpg\" (UniqueName: \"kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.325196 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.328419 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.329339 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.336039 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.337289 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.340033 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.343566 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.377055 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.380163 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.390851 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7jj\" (UniqueName: \"kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj\") pod \"keystone-bootstrap-rq2zb\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlpg\" (UniqueName: \"kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408122 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nh8\" (UniqueName: \"kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408227 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408261 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408291 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408462 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408555 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.408586 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.412211 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.426055 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.426674 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.431979 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.446424 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.447062 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlpg\" (UniqueName: \"kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg\") pod \"neutron-db-sync-fc99g\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.486679 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-q642m"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.487968 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.507483 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.507634 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-lvx99"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.508306 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.508544 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fg25t" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.508719 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.509592 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.509689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.509737 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.509792 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.509976 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz5xm\" (UniqueName: \"kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.510063 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nh8\" (UniqueName: \"kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526740 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526815 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526843 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526868 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526901 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.526954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.511722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.519092 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.521444 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q642m"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.519206 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-dbp65" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.547249 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fc99g" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.548194 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.549376 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.555282 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.575141 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nh8\" (UniqueName: \"kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8\") pod \"horizon-6f7f97fc4f-7jrw9\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.603129 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lvx99"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.610635 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629530 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629628 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbh6c\" (UniqueName: \"kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629658 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629722 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629757 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.629825 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz5xm\" (UniqueName: \"kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631199 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631259 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631284 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631349 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631384 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631472 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqzj\" (UniqueName: \"kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631531 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.631570 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.633922 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.634642 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.635141 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.639081 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.650483 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.653444 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.654099 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.660465 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-srzft"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.673796 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.696009 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k954f" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.696292 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.696794 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.701119 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.701500 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz5xm\" (UniqueName: \"kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm\") pod \"ceilometer-0\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.702615 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.729007 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srzft"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.733401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734073 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734645 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbh6c\" (UniqueName: \"kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734691 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734746 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734766 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734784 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734812 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734843 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " 
pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.734863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqzj\" (UniqueName: \"kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.739936 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.753782 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.757147 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.761911 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.766376 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.766532 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.771805 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-p69cr"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.772901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.773416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqzj\" (UniqueName: \"kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.774361 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.777044 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.777425 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data\") pod \"manila-db-sync-lvx99\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.778719 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.779466 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.779677 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.781215 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qc2xk" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.801575 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbh6c\" (UniqueName: \"kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c\") pod \"cinder-db-sync-q642m\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.811039 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.825240 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9q6d6" event={"ID":"6343580b-81bd-4993-a298-3b31730e6ae3","Type":"ContainerStarted","Data":"fbf61924d8fb146210c44e926db797ef98bfd70f46df952c59704501d3c5dbb2"} Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.827853 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838130 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838191 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838217 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838233 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9v8\" (UniqueName: \"kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838296 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838361 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838384 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838404 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n9qq\" (UniqueName: \"kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.838419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.847686 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-p69cr"] Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.859286 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q642m" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.878146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73554998-24a4-4d23-a78d-66d51cbe24af","Type":"ContainerStarted","Data":"4cc00fb39259c5463653b73cb76ff77695acf0214bd88fe3b35ed93f3f2e9ce1"} Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.881544 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lvx99" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.932984 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9q6d6" podStartSLOduration=3.717291603 podStartE2EDuration="38.932953692s" podCreationTimestamp="2025-12-04 10:03:47 +0000 UTC" firstStartedPulling="2025-12-04 10:03:49.187165321 +0000 UTC m=+1275.084759074" lastFinishedPulling="2025-12-04 10:04:24.40282741 +0000 UTC m=+1310.300421163" observedRunningTime="2025-12-04 10:04:25.852546552 +0000 UTC m=+1311.750140305" watchObservedRunningTime="2025-12-04 10:04:25.932953692 +0000 UTC m=+1311.830547445" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.939613 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=70.363551136 podStartE2EDuration="1m21.939591605s" podCreationTimestamp="2025-12-04 10:03:04 +0000 UTC" firstStartedPulling="2025-12-04 10:04:10.149052756 +0000 UTC m=+1296.046646509" lastFinishedPulling="2025-12-04 10:04:21.725093225 +0000 UTC m=+1307.622686978" observedRunningTime="2025-12-04 10:04:25.923483441 +0000 UTC m=+1311.821077194" watchObservedRunningTime="2025-12-04 10:04:25.939591605 +0000 UTC m=+1311.837185358" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940347 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940379 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940395 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940419 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9v8\" (UniqueName: \"kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940462 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940499 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmz2\" (UniqueName: \"kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940535 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940559 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940577 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n9qq\" (UniqueName: \"kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.940597 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.941772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.944783 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946557 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946696 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946739 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6w7s\" (UniqueName: \"kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946833 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946912 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946950 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.946992 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.947131 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.948094 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.964756 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.966229 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.971034 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n9qq\" (UniqueName: \"kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.971950 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9v8\" (UniqueName: \"kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8\") pod \"dnsmasq-dns-6ffb94d8ff-rztls\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:25 crc kubenswrapper[4693]: I1204 10:04:25.975117 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle\") pod \"placement-db-sync-srzft\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " pod="openstack/placement-db-sync-srzft" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.035753 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srzft" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048593 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6w7s\" (UniqueName: \"kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048638 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048684 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048705 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.048816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmz2\" (UniqueName: \"kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.050868 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.051445 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.052798 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.053251 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.059887 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.069519 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.071918 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.089813 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6w7s\" (UniqueName: \"kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s\") pod \"horizon-547d7c7775-rc6qr\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.093015 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmz2\" (UniqueName: \"kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2\") pod \"barbican-db-sync-p69cr\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.108701 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.139799 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-p69cr" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.267688 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:26 crc kubenswrapper[4693]: W1204 10:04:26.314212 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod953c5f43_bffd_4b82_9693_e0809449dc30.slice/crio-4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0 WatchSource:0}: Error finding container 4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0: Status 404 returned error can't find the container with id 4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0 Dec 04 10:04:26 crc kubenswrapper[4693]: W1204 10:04:26.327519 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80436510_952e_4346_9c8a_3dbd090866eb.slice/crio-a0d1211800268203a39f92005e7b6e948e6b36babd56b7084febbfc757a93447 WatchSource:0}: Error finding container a0d1211800268203a39f92005e7b6e948e6b36babd56b7084febbfc757a93447: Status 404 returned error can't find the container with id a0d1211800268203a39f92005e7b6e948e6b36babd56b7084febbfc757a93447 Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.329614 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rq2zb"] Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.519630 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.520074 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.528158 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.539423 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.557577 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.583678 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fc99g"] Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.682425 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6hr\" (UniqueName: \"kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.682863 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.682906 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.682954 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.683006 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.683053 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.715218 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:04:26 crc kubenswrapper[4693]: W1204 10:04:26.749632 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e9c528f_c36e_4182_9e00_1aca67f3d19c.slice/crio-94b0479c0851490a775e6ee11ff2c694e0ae512de0cc81e3b3ceedb21a33a873 WatchSource:0}: Error finding container 
94b0479c0851490a775e6ee11ff2c694e0ae512de0cc81e3b3ceedb21a33a873: Status 404 returned error can't find the container with id 94b0479c0851490a775e6ee11ff2c694e0ae512de0cc81e3b3ceedb21a33a873 Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785231 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785299 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785355 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785444 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785472 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6hr\" (UniqueName: \"kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.785506 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.786499 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.786834 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.787187 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: 
\"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.787467 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.787782 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.826276 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6hr\" (UniqueName: \"kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr\") pod \"dnsmasq-dns-cf78879c9-mlf2h\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.897126 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.910119 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fc99g" event={"ID":"228babee-6748-4512-bd76-92168eab2e2d","Type":"ContainerStarted","Data":"a1dabcf3ca36ec22171d881e79f15ddebc3283b600489e4bf6b5f75cdce165a0"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.911118 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rq2zb" event={"ID":"953c5f43-bffd-4b82-9693-e0809449dc30","Type":"ContainerStarted","Data":"fa9176a09b30d68f093d6dca03844f9bce4164b1818bfc5bd839a618c29b688c"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.911143 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rq2zb" event={"ID":"953c5f43-bffd-4b82-9693-e0809449dc30","Type":"ContainerStarted","Data":"4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.913786 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f7f97fc4f-7jrw9" event={"ID":"8e9c528f-c36e-4182-9e00-1aca67f3d19c","Type":"ContainerStarted","Data":"94b0479c0851490a775e6ee11ff2c694e0ae512de0cc81e3b3ceedb21a33a873"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.915878 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" event={"ID":"80436510-952e-4346-9c8a-3dbd090866eb","Type":"ContainerStarted","Data":"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.915952 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" event={"ID":"80436510-952e-4346-9c8a-3dbd090866eb","Type":"ContainerStarted","Data":"a0d1211800268203a39f92005e7b6e948e6b36babd56b7084febbfc757a93447"} Dec 04 10:04:26 crc kubenswrapper[4693]: I1204 10:04:26.918130 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" podUID="80436510-952e-4346-9c8a-3dbd090866eb" containerName="init" 
containerID="cri-o://737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433" gracePeriod=10 Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.014355 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rq2zb" podStartSLOduration=2.014324376 podStartE2EDuration="2.014324376s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:26.962583257 +0000 UTC m=+1312.860177010" watchObservedRunningTime="2025-12-04 10:04:27.014324376 +0000 UTC m=+1312.911918129" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.085773 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q642m"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.116997 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.186888 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srzft"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.222655 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.247912 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lvx99"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.290702 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-p69cr"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.309691 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:27 crc kubenswrapper[4693]: W1204 10:04:27.359652 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11044bb2_8f41_45cd_a9c6_dc43709966a6.slice/crio-1bc038f9f622d2e700e382d16dd033cb5dfcb399dd10d05dd093e2732720efc3 WatchSource:0}: Error finding container 1bc038f9f622d2e700e382d16dd033cb5dfcb399dd10d05dd093e2732720efc3: Status 404 returned error can't find the container with id 1bc038f9f622d2e700e382d16dd033cb5dfcb399dd10d05dd093e2732720efc3 Dec 04 10:04:27 crc kubenswrapper[4693]: W1204 10:04:27.360441 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3ad32c_eb55_44e3_bd1e_caa1409f0b1d.slice/crio-c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03 WatchSource:0}: Error finding container c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03: Status 404 returned error can't find the container with id c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03 Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.568958 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.608472 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb\") pod \"80436510-952e-4346-9c8a-3dbd090866eb\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.608667 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config\") pod \"80436510-952e-4346-9c8a-3dbd090866eb\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.608776 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb\") pod \"80436510-952e-4346-9c8a-3dbd090866eb\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.608814 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzf6g\" (UniqueName: \"kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g\") pod \"80436510-952e-4346-9c8a-3dbd090866eb\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.608872 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc\") pod \"80436510-952e-4346-9c8a-3dbd090866eb\" (UID: \"80436510-952e-4346-9c8a-3dbd090866eb\") " Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.618759 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.648553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g" (OuterVolumeSpecName: "kube-api-access-tzf6g") pod "80436510-952e-4346-9c8a-3dbd090866eb" (UID: "80436510-952e-4346-9c8a-3dbd090866eb"). InnerVolumeSpecName "kube-api-access-tzf6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.696924 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:04:27 crc kubenswrapper[4693]: E1204 10:04:27.697288 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80436510-952e-4346-9c8a-3dbd090866eb" containerName="init" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.697307 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="80436510-952e-4346-9c8a-3dbd090866eb" containerName="init" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.697980 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="80436510-952e-4346-9c8a-3dbd090866eb" containerName="init" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.699300 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.712790 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzf6g\" (UniqueName: \"kubernetes.io/projected/80436510-952e-4346-9c8a-3dbd090866eb-kube-api-access-tzf6g\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.720483 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.730967 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.753282 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80436510-952e-4346-9c8a-3dbd090866eb" (UID: "80436510-952e-4346-9c8a-3dbd090866eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.754214 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config" (OuterVolumeSpecName: "config") pod "80436510-952e-4346-9c8a-3dbd090866eb" (UID: "80436510-952e-4346-9c8a-3dbd090866eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.756642 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "80436510-952e-4346-9c8a-3dbd090866eb" (UID: "80436510-952e-4346-9c8a-3dbd090866eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.767602 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "80436510-952e-4346-9c8a-3dbd090866eb" (UID: "80436510-952e-4346-9c8a-3dbd090866eb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.782433 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.816412 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.816679 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.816850 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.816949 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fphf\" (UniqueName: \"kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.817015 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.817222 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.817242 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.817254 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.817267 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80436510-952e-4346-9c8a-3dbd090866eb-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.923367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: 
\"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.923447 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.923482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.923516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fphf\" (UniqueName: \"kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.923540 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.924261 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.924266 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.924908 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.934313 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.938644 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547d7c7775-rc6qr" event={"ID":"16682812-65a8-4824-a8ae-f3a0b9932ccd","Type":"ContainerStarted","Data":"ba800a2d33e0804c52d37676458fd10b1d56cd887ac970316ff19cb3feac3380"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.940974 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fphf\" (UniqueName: 
\"kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf\") pod \"horizon-845b6c7c9c-kdkqz\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.941543 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p69cr" event={"ID":"a62cb864-103c-4b89-afeb-8397af4046cb","Type":"ContainerStarted","Data":"ec4415adebedb9e0cf252ebac162686d060affc48b4259f371212f89081a7465"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.943389 4693 generic.go:334] "Generic (PLEG): container finished" podID="80436510-952e-4346-9c8a-3dbd090866eb" containerID="737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433" exitCode=0 Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.943439 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" event={"ID":"80436510-952e-4346-9c8a-3dbd090866eb","Type":"ContainerDied","Data":"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.943457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" event={"ID":"80436510-952e-4346-9c8a-3dbd090866eb","Type":"ContainerDied","Data":"a0d1211800268203a39f92005e7b6e948e6b36babd56b7084febbfc757a93447"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.943474 4693 scope.go:117] "RemoveContainer" containerID="737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.943581 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-wq9jf" Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.953651 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lvx99" event={"ID":"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d","Type":"ContainerStarted","Data":"c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.957043 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" event={"ID":"d7a457be-015b-44a1-b55c-d0254008b53f","Type":"ContainerStarted","Data":"6b46f750a89e37df6b68986ecfe97c5373d50fbc0a79568c6b6a61cc1d3ceaf8"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.967917 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerStarted","Data":"94dd8bf202506c2751dcbc9c3a21f70d2ea17bc2df286815d3fa2720fee25c2c"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.971918 4693 generic.go:334] "Generic (PLEG): container finished" podID="11044bb2-8f41-45cd-a9c6-dc43709966a6" containerID="1d44e2083ab5fb37b07231b26e0b0634da1e95a5e2a7a42025bc7a10ae88f63d" exitCode=0 Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.972034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" event={"ID":"11044bb2-8f41-45cd-a9c6-dc43709966a6","Type":"ContainerDied","Data":"1d44e2083ab5fb37b07231b26e0b0634da1e95a5e2a7a42025bc7a10ae88f63d"} Dec 04 10:04:27 crc kubenswrapper[4693]: I1204 10:04:27.972059 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" 
event={"ID":"11044bb2-8f41-45cd-a9c6-dc43709966a6","Type":"ContainerStarted","Data":"1bc038f9f622d2e700e382d16dd033cb5dfcb399dd10d05dd093e2732720efc3"} Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.024159 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.026382 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q642m" event={"ID":"832e603c-b695-442e-bcf6-fa322cfc1524","Type":"ContainerStarted","Data":"a7cd6b439a4228953c57b6b93122ffcf4dc2b462426e0856cccdc0f580ab0b13"} Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.026497 4693 scope.go:117] "RemoveContainer" containerID="737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.040889 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.041593 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-wq9jf"] Dec 04 10:04:28 crc kubenswrapper[4693]: E1204 10:04:28.052096 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433\": container with ID starting with 737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433 not found: ID does not exist" containerID="737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.052152 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433"} err="failed to get container status \"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433\": rpc error: code = NotFound desc = could not find container \"737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433\": container with ID starting with 737755f881ca79ee9857f66eccc2527cd6c9b3b3a73a336fee913b0ff0483433 not found: ID does not exist" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.054325 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fc99g" event={"ID":"228babee-6748-4512-bd76-92168eab2e2d","Type":"ContainerStarted","Data":"bff8e3ebeba716474615ecef4839fdd6eea301daa1ae5f089ff8b5e4b1c0f79b"} Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.057916 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srzft" event={"ID":"af7a03e1-cb13-4536-9405-791381101cdc","Type":"ContainerStarted","Data":"5d42c355992d6deb77facf0f2e868daece9b78cd6c1c706a85c4823fa0b63cc1"} Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.099846 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fc99g" podStartSLOduration=3.099824283 podStartE2EDuration="3.099824283s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:28.081497237 +0000 UTC m=+1313.979091010" watchObservedRunningTime="2025-12-04 10:04:28.099824283 +0000 UTC m=+1313.997418036" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.474892 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="80436510-952e-4346-9c8a-3dbd090866eb" path="/var/lib/kubelet/pods/80436510-952e-4346-9c8a-3dbd090866eb/volumes" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.522850 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.547907 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config\") pod \"11044bb2-8f41-45cd-a9c6-dc43709966a6\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.547983 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9v8\" (UniqueName: \"kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8\") pod \"11044bb2-8f41-45cd-a9c6-dc43709966a6\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.548069 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc\") pod \"11044bb2-8f41-45cd-a9c6-dc43709966a6\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.548255 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb\") pod \"11044bb2-8f41-45cd-a9c6-dc43709966a6\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.548292 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb\") pod \"11044bb2-8f41-45cd-a9c6-dc43709966a6\" (UID: \"11044bb2-8f41-45cd-a9c6-dc43709966a6\") " Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.557242 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8" (OuterVolumeSpecName: "kube-api-access-7w9v8") pod "11044bb2-8f41-45cd-a9c6-dc43709966a6" (UID: "11044bb2-8f41-45cd-a9c6-dc43709966a6"). InnerVolumeSpecName "kube-api-access-7w9v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.577673 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11044bb2-8f41-45cd-a9c6-dc43709966a6" (UID: "11044bb2-8f41-45cd-a9c6-dc43709966a6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.577686 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11044bb2-8f41-45cd-a9c6-dc43709966a6" (UID: "11044bb2-8f41-45cd-a9c6-dc43709966a6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.591014 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config" (OuterVolumeSpecName: "config") pod "11044bb2-8f41-45cd-a9c6-dc43709966a6" (UID: "11044bb2-8f41-45cd-a9c6-dc43709966a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.595090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11044bb2-8f41-45cd-a9c6-dc43709966a6" (UID: "11044bb2-8f41-45cd-a9c6-dc43709966a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.654325 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.654403 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.654413 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.654423 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9v8\" (UniqueName: \"kubernetes.io/projected/11044bb2-8f41-45cd-a9c6-dc43709966a6-kube-api-access-7w9v8\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.654437 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11044bb2-8f41-45cd-a9c6-dc43709966a6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:28 crc kubenswrapper[4693]: I1204 10:04:28.657299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:04:28 crc kubenswrapper[4693]: W1204 10:04:28.665391 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0497c95b_05c5_468e_a596_4636ae376f5d.slice/crio-dd4a6f8ba151234f4c321d07726e93739d7144e26fb52129dd412d3b71a2e821 WatchSource:0}: Error finding container dd4a6f8ba151234f4c321d07726e93739d7144e26fb52129dd412d3b71a2e821: Status 404 returned error can't find the container with id dd4a6f8ba151234f4c321d07726e93739d7144e26fb52129dd412d3b71a2e821 Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.083796 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.084102 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-rztls" event={"ID":"11044bb2-8f41-45cd-a9c6-dc43709966a6","Type":"ContainerDied","Data":"1bc038f9f622d2e700e382d16dd033cb5dfcb399dd10d05dd093e2732720efc3"} Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.084157 4693 scope.go:117] "RemoveContainer" containerID="1d44e2083ab5fb37b07231b26e0b0634da1e95a5e2a7a42025bc7a10ae88f63d" Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.105403 4693 generic.go:334] "Generic (PLEG): container finished" podID="d7a457be-015b-44a1-b55c-d0254008b53f" containerID="af2fefd302a70b31602db0fcbd43f23149a6b5efa86648d1f2fbfc800903d463" exitCode=0 Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.105487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" event={"ID":"d7a457be-015b-44a1-b55c-d0254008b53f","Type":"ContainerDied","Data":"af2fefd302a70b31602db0fcbd43f23149a6b5efa86648d1f2fbfc800903d463"} Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.109921 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845b6c7c9c-kdkqz" event={"ID":"0497c95b-05c5-468e-a596-4636ae376f5d","Type":"ContainerStarted","Data":"dd4a6f8ba151234f4c321d07726e93739d7144e26fb52129dd412d3b71a2e821"} Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.176088 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:29 crc kubenswrapper[4693]: I1204 10:04:29.213902 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-rztls"] Dec 04 10:04:30 crc kubenswrapper[4693]: I1204 10:04:30.479050 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11044bb2-8f41-45cd-a9c6-dc43709966a6" path="/var/lib/kubelet/pods/11044bb2-8f41-45cd-a9c6-dc43709966a6/volumes" Dec 04 10:04:31 crc kubenswrapper[4693]: I1204 10:04:31.143071 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" event={"ID":"d7a457be-015b-44a1-b55c-d0254008b53f","Type":"ContainerStarted","Data":"e5dedb9ba7c84c0ecc079aba1c170dec64e90bed927ebba1c4091fe857eafe51"} Dec 04 10:04:31 crc kubenswrapper[4693]: I1204 10:04:31.143235 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:31 crc kubenswrapper[4693]: I1204 10:04:31.171012 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" podStartSLOduration=5.170992919 podStartE2EDuration="5.170992919s" podCreationTimestamp="2025-12-04 10:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:31.165112046 +0000 UTC m=+1317.062705799" watchObservedRunningTime="2025-12-04 10:04:31.170992919 +0000 UTC m=+1317.068586672" Dec 04 10:04:33 crc kubenswrapper[4693]: I1204 10:04:33.167219 4693 generic.go:334] "Generic (PLEG): container finished" podID="953c5f43-bffd-4b82-9693-e0809449dc30" containerID="fa9176a09b30d68f093d6dca03844f9bce4164b1818bfc5bd839a618c29b688c" exitCode=0 Dec 04 10:04:33 crc kubenswrapper[4693]: I1204 10:04:33.167302 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rq2zb" 
event={"ID":"953c5f43-bffd-4b82-9693-e0809449dc30","Type":"ContainerDied","Data":"fa9176a09b30d68f093d6dca03844f9bce4164b1818bfc5bd839a618c29b688c"} Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.519169 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.532087 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:04:40 crc kubenswrapper[4693]: E1204 10:04:34.533470 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11044bb2-8f41-45cd-a9c6-dc43709966a6" containerName="init" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.533494 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="11044bb2-8f41-45cd-a9c6-dc43709966a6" containerName="init" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.533732 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="11044bb2-8f41-45cd-a9c6-dc43709966a6" containerName="init" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.534904 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.542826 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.563755 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.675733 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700027 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700074 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700343 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700483 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs\") pod 
\"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700527 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99z4b\" (UniqueName: \"kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.700609 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.732579 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f8cd9d6cb-vf5bx"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.734414 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.741784 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8cd9d6cb-vf5bx"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802216 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802270 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802363 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802399 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99z4b\" (UniqueName: \"kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.802537 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.803272 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.803440 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.805747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.808605 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.808935 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.808978 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.820124 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99z4b\" (UniqueName: \"kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b\") pod \"horizon-7649787bc6-fddzc\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.863070 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904399 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2592b0-5dfd-4d15-996c-2340af86bd26-logs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904444 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-tls-certs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904496 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9gw\" (UniqueName: \"kubernetes.io/projected/ca2592b0-5dfd-4d15-996c-2340af86bd26-kube-api-access-zg9gw\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904535 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-scripts\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904568 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-combined-ca-bundle\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904585 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-secret-key\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:34.904621 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-config-data\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.006852 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2592b0-5dfd-4d15-996c-2340af86bd26-logs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.006916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-tls-certs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") 
" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.006986 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9gw\" (UniqueName: \"kubernetes.io/projected/ca2592b0-5dfd-4d15-996c-2340af86bd26-kube-api-access-zg9gw\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.007049 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-scripts\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.007099 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-combined-ca-bundle\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.007124 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-secret-key\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.007173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-config-data\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.007411 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2592b0-5dfd-4d15-996c-2340af86bd26-logs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.008126 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-scripts\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.009369 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca2592b0-5dfd-4d15-996c-2340af86bd26-config-data\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.011109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-secret-key\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.011572 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-horizon-tls-certs\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.013292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2592b0-5dfd-4d15-996c-2340af86bd26-combined-ca-bundle\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.024426 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9gw\" (UniqueName: \"kubernetes.io/projected/ca2592b0-5dfd-4d15-996c-2340af86bd26-kube-api-access-zg9gw\") pod \"horizon-6f8cd9d6cb-vf5bx\" (UID: \"ca2592b0-5dfd-4d15-996c-2340af86bd26\") " pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:35.047151 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:36.899562 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:36.954539 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:36.954813 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" containerID="cri-o://e0228ed7b3806a077c1d1ecb90531aeadea88cbcd2c784cb1a25ac1bf04b7ee8" gracePeriod=10 Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:37.715246 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:38.214731 4693 generic.go:334] "Generic (PLEG): container finished" podID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerID="e0228ed7b3806a077c1d1ecb90531aeadea88cbcd2c784cb1a25ac1bf04b7ee8" exitCode=0 Dec 04 10:04:40 crc kubenswrapper[4693]: I1204 10:04:38.214774 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerDied","Data":"e0228ed7b3806a077c1d1ecb90531aeadea88cbcd2c784cb1a25ac1bf04b7ee8"} Dec 04 10:04:42 crc kubenswrapper[4693]: I1204 10:04:42.715034 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Dec 04 10:04:44 crc kubenswrapper[4693]: E1204 10:04:44.948599 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 10:04:44 crc kubenswrapper[4693]: E1204 10:04:44.949142 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n75h687h54dh7h57fh9bh698h679hd4hc8h595h567h577h8dh64dh66h547h5bfhb5h667hd5h5bh58dh56ch95h87h545hb7h9ch698h5hd4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9nh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f7f97fc4f-7jrw9_openstack(8e9c528f-c36e-4182-9e00-1aca67f3d19c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:04:44 crc kubenswrapper[4693]: E1204 10:04:44.952986 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f7f97fc4f-7jrw9" podUID="8e9c528f-c36e-4182-9e00-1aca67f3d19c" Dec 04 10:04:47 crc kubenswrapper[4693]: E1204 10:04:47.510783 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 10:04:47 crc kubenswrapper[4693]: E1204 10:04:47.511220 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n646h67chf6h645h5f9h95h696h59ch84h64dh5fh547h75h54ch8chd7h5fh679h98h68dhb5h675h99h5f4hf8h66bh645h67ch55fh74h5c9h5c4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fphf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-845b6c7c9c-kdkqz_openstack(0497c95b-05c5-468e-a596-4636ae376f5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:04:47 crc kubenswrapper[4693]: E1204 10:04:47.513590 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-845b6c7c9c-kdkqz" podUID="0497c95b-05c5-468e-a596-4636ae376f5d" Dec 04 10:04:47 crc kubenswrapper[4693]: I1204 10:04:47.715403 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Dec 04 10:04:47 crc kubenswrapper[4693]: I1204 10:04:47.715574 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:04:48 crc kubenswrapper[4693]: I1204 10:04:48.334577 4693 generic.go:334] "Generic (PLEG): container finished" podID="6343580b-81bd-4993-a298-3b31730e6ae3" containerID="fbf61924d8fb146210c44e926db797ef98bfd70f46df952c59704501d3c5dbb2" exitCode=0 Dec 04 10:04:48 crc kubenswrapper[4693]: I1204 10:04:48.334743 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9q6d6" event={"ID":"6343580b-81bd-4993-a298-3b31730e6ae3","Type":"ContainerDied","Data":"fbf61924d8fb146210c44e926db797ef98bfd70f46df952c59704501d3c5dbb2"} Dec 04 10:04:49 crc 
kubenswrapper[4693]: E1204 10:04:49.209468 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 10:04:49 crc kubenswrapper[4693]: E1204 10:04:49.209857 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9dhd6h594h57fh76h646h574h77h5d8h5c5h5dh58ch74h598h59fh5c9hf5h5f4h99h7bhc5hbfhdch666h556hd4h68dh67h6bh6ch665h6dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l6w7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-547d7c7775-rc6qr_openstack(16682812-65a8-4824-a8ae-f3a0b9932ccd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:04:49 crc kubenswrapper[4693]: E1204 10:04:49.212255 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-547d7c7775-rc6qr" podUID="16682812-65a8-4824-a8ae-f3a0b9932ccd" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.309911 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.349413 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rq2zb" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.349838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rq2zb" event={"ID":"953c5f43-bffd-4b82-9693-e0809449dc30","Type":"ContainerDied","Data":"4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0"} Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.349871 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a86385a191ada99df0cc26f9b4a6f23ea1f2c28e2d434f88a5e637cb043caa0" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379662 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379702 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379753 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379774 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht7jj\" (UniqueName: \"kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379800 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.379844 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys\") pod \"953c5f43-bffd-4b82-9693-e0809449dc30\" (UID: \"953c5f43-bffd-4b82-9693-e0809449dc30\") " Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.390681 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.396833 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj" (OuterVolumeSpecName: "kube-api-access-ht7jj") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "kube-api-access-ht7jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.397983 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts" (OuterVolumeSpecName: "scripts") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.404497 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.435237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data" (OuterVolumeSpecName: "config-data") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.454457 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "953c5f43-bffd-4b82-9693-e0809449dc30" (UID: "953c5f43-bffd-4b82-9693-e0809449dc30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484052 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484089 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484102 4693 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484113 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht7jj\" (UniqueName: \"kubernetes.io/projected/953c5f43-bffd-4b82-9693-e0809449dc30-kube-api-access-ht7jj\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484125 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.484135 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/953c5f43-bffd-4b82-9693-e0809449dc30-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:49 crc kubenswrapper[4693]: E1204 10:04:49.882947 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: 
context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Dec 04 10:04:49 crc kubenswrapper[4693]: E1204 10:04:49.883146 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcqzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-lvx99_openstack(7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:04:49 crc kubenswrapper[4693]: E1204 10:04:49.884351 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-lvx99" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" Dec 04 10:04:49 crc kubenswrapper[4693]: I1204 10:04:49.956570 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9q6d6" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.121104 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data\") pod \"6343580b-81bd-4993-a298-3b31730e6ae3\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.121575 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data\") pod \"6343580b-81bd-4993-a298-3b31730e6ae3\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.121665 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr8d7\" (UniqueName: \"kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7\") pod \"6343580b-81bd-4993-a298-3b31730e6ae3\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.121778 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle\") pod \"6343580b-81bd-4993-a298-3b31730e6ae3\" (UID: \"6343580b-81bd-4993-a298-3b31730e6ae3\") " Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.126599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7" (OuterVolumeSpecName: "kube-api-access-dr8d7") pod "6343580b-81bd-4993-a298-3b31730e6ae3" (UID: "6343580b-81bd-4993-a298-3b31730e6ae3"). InnerVolumeSpecName "kube-api-access-dr8d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.127673 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6343580b-81bd-4993-a298-3b31730e6ae3" (UID: "6343580b-81bd-4993-a298-3b31730e6ae3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.146594 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6343580b-81bd-4993-a298-3b31730e6ae3" (UID: "6343580b-81bd-4993-a298-3b31730e6ae3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.166315 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data" (OuterVolumeSpecName: "config-data") pod "6343580b-81bd-4993-a298-3b31730e6ae3" (UID: "6343580b-81bd-4993-a298-3b31730e6ae3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.224530 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.224558 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.224571 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6343580b-81bd-4993-a298-3b31730e6ae3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.224580 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr8d7\" (UniqueName: \"kubernetes.io/projected/6343580b-81bd-4993-a298-3b31730e6ae3-kube-api-access-dr8d7\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.380839 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9q6d6" event={"ID":"6343580b-81bd-4993-a298-3b31730e6ae3","Type":"ContainerDied","Data":"b39baba43558b8cc997c029362a65f3366d1bb5f87bac260311832c77619c2bd"} Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.380883 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9q6d6" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.380903 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39baba43558b8cc997c029362a65f3366d1bb5f87bac260311832c77619c2bd" Dec 04 10:04:50 crc kubenswrapper[4693]: E1204 10:04:50.382587 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-lvx99" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.400538 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rq2zb"] Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.412718 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rq2zb"] Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.477307 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953c5f43-bffd-4b82-9693-e0809449dc30" path="/var/lib/kubelet/pods/953c5f43-bffd-4b82-9693-e0809449dc30/volumes" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.485577 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jw5j4"] Dec 04 10:04:50 crc kubenswrapper[4693]: E1204 10:04:50.488467 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953c5f43-bffd-4b82-9693-e0809449dc30" containerName="keystone-bootstrap" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.488496 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="953c5f43-bffd-4b82-9693-e0809449dc30" containerName="keystone-bootstrap" Dec 04 10:04:50 crc kubenswrapper[4693]: E1204 10:04:50.488515 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" 
containerName="glance-db-sync" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.488522 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" containerName="glance-db-sync" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.488703 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="953c5f43-bffd-4b82-9693-e0809449dc30" containerName="keystone-bootstrap" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.488723 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" containerName="glance-db-sync" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.489297 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.492447 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.492662 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.492835 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.493126 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.493661 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9fq9s" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.498325 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jw5j4"] Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637573 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637613 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637727 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbzx\" (UniqueName: \"kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637779 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.637799 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739114 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739179 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739257 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739278 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739334 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.739385 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbzx\" (UniqueName: \"kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.745941 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.748078 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: 
\"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.750944 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.750957 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.759059 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.766132 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.767640 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.778111 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbzx\" (UniqueName: \"kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx\") pod \"keystone-bootstrap-jw5j4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.809282 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.809902 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942432 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942536 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt84q\" (UniqueName: \"kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942661 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:50 crc kubenswrapper[4693]: I1204 10:04:50.942710 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt84q\" (UniqueName: \"kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044235 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044291 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0\") pod 
\"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044342 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044386 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.044440 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.045369 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.045467 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.046129 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.046142 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.046668 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.064989 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt84q\" (UniqueName: \"kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q\") pod \"dnsmasq-dns-56df8fb6b7-9t76h\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.196666 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.667588 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.669722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.671947 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.672219 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mdxbt" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.672353 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.672403 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.686493 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.864575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.864677 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.865066 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.865244 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.865371 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.867542 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.867611 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.867636 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969228 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969269 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969297 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969350 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969393 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969417 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969520 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969820 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.969978 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.970014 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.975679 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.976196 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.984535 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.992861 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:51 crc kubenswrapper[4693]: I1204 10:04:51.994758 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.000227 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.043691 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.045066 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.050668 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.058967 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172736 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172777 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172890 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.172925 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdfzj\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc 
kubenswrapper[4693]: I1204 10:04:52.173008 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.173044 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274538 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274693 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274723 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdfzj\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274777 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274799 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.274862 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.275129 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.275329 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.275998 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.279252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.280009 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.284536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.289279 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.291880 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdfzj\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.293825 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.307603 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:04:52 crc kubenswrapper[4693]: I1204 10:04:52.381780 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:04:53 crc kubenswrapper[4693]: I1204 10:04:53.241787 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:04:53 crc kubenswrapper[4693]: I1204 10:04:53.309538 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:04:57 crc kubenswrapper[4693]: I1204 10:04:57.715580 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Dec 04 10:05:00 crc kubenswrapper[4693]: E1204 10:05:00.198310 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 04 10:05:00 crc kubenswrapper[4693]: E1204 10:05:00.198928 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65bhb8h7dh545h664h5bhf8h9dhd9h65dh66h67dhd5h68dh598h56h668h567h8fh57bh577h59bh8h95h9dh59fh598h577h648hf8h5bbh54fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pz5xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(edf292dd-e2f4-4e80-ab9a-3e548118489a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.202745 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.289768 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.456044 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data\") pod \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.456113 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs\") pod \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.456320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9nh8\" (UniqueName: \"kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8\") pod \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.456403 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key\") pod \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.456439 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts\") pod \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\" (UID: \"8e9c528f-c36e-4182-9e00-1aca67f3d19c\") " Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.457234 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts" (OuterVolumeSpecName: "scripts") pod "8e9c528f-c36e-4182-9e00-1aca67f3d19c" (UID: "8e9c528f-c36e-4182-9e00-1aca67f3d19c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.457362 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data" (OuterVolumeSpecName: "config-data") pod "8e9c528f-c36e-4182-9e00-1aca67f3d19c" (UID: "8e9c528f-c36e-4182-9e00-1aca67f3d19c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.457632 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs" (OuterVolumeSpecName: "logs") pod "8e9c528f-c36e-4182-9e00-1aca67f3d19c" (UID: "8e9c528f-c36e-4182-9e00-1aca67f3d19c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.465661 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8e9c528f-c36e-4182-9e00-1aca67f3d19c" (UID: "8e9c528f-c36e-4182-9e00-1aca67f3d19c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.503026 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8" (OuterVolumeSpecName: "kube-api-access-q9nh8") pod "8e9c528f-c36e-4182-9e00-1aca67f3d19c" (UID: "8e9c528f-c36e-4182-9e00-1aca67f3d19c"). InnerVolumeSpecName "kube-api-access-q9nh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.526977 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f7f97fc4f-7jrw9" event={"ID":"8e9c528f-c36e-4182-9e00-1aca67f3d19c","Type":"ContainerDied","Data":"94b0479c0851490a775e6ee11ff2c694e0ae512de0cc81e3b3ceedb21a33a873"} Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.527151 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f7f97fc4f-7jrw9" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.559650 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.559691 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e9c528f-c36e-4182-9e00-1aca67f3d19c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.559707 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9nh8\" (UniqueName: \"kubernetes.io/projected/8e9c528f-c36e-4182-9e00-1aca67f3d19c-kube-api-access-q9nh8\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.559718 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8e9c528f-c36e-4182-9e00-1aca67f3d19c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.559732 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e9c528f-c36e-4182-9e00-1aca67f3d19c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.623468 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:05:00 crc kubenswrapper[4693]: I1204 10:05:00.631991 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f7f97fc4f-7jrw9"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.282092 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.297664 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.302892 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395566 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwzg\" (UniqueName: \"kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg\") pod \"334678ba-c391-4eb7-a693-37bb8dde6c26\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395646 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb\") pod \"334678ba-c391-4eb7-a693-37bb8dde6c26\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395669 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb\") pod \"334678ba-c391-4eb7-a693-37bb8dde6c26\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395706 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs\") pod \"0497c95b-05c5-468e-a596-4636ae376f5d\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395766 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data\") pod \"0497c95b-05c5-468e-a596-4636ae376f5d\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395809 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6w7s\" (UniqueName: \"kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s\") pod \"16682812-65a8-4824-a8ae-f3a0b9932ccd\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395832 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key\") pod \"16682812-65a8-4824-a8ae-f3a0b9932ccd\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395878 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key\") pod \"0497c95b-05c5-468e-a596-4636ae376f5d\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.395902 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config\") pod \"334678ba-c391-4eb7-a693-37bb8dde6c26\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396403 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs" (OuterVolumeSpecName: "logs") pod "0497c95b-05c5-468e-a596-4636ae376f5d" (UID: 
"0497c95b-05c5-468e-a596-4636ae376f5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396499 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts\") pod \"0497c95b-05c5-468e-a596-4636ae376f5d\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396552 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs\") pod \"16682812-65a8-4824-a8ae-f3a0b9932ccd\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396565 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data" (OuterVolumeSpecName: "config-data") pod "0497c95b-05c5-468e-a596-4636ae376f5d" (UID: "0497c95b-05c5-468e-a596-4636ae376f5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396600 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts\") pod \"16682812-65a8-4824-a8ae-f3a0b9932ccd\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396648 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fphf\" (UniqueName: \"kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf\") pod \"0497c95b-05c5-468e-a596-4636ae376f5d\" (UID: \"0497c95b-05c5-468e-a596-4636ae376f5d\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396695 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data\") pod \"16682812-65a8-4824-a8ae-f3a0b9932ccd\" (UID: \"16682812-65a8-4824-a8ae-f3a0b9932ccd\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396720 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc\") pod \"334678ba-c391-4eb7-a693-37bb8dde6c26\" (UID: \"334678ba-c391-4eb7-a693-37bb8dde6c26\") " Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.396978 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts" (OuterVolumeSpecName: "scripts") pod "0497c95b-05c5-468e-a596-4636ae376f5d" (UID: "0497c95b-05c5-468e-a596-4636ae376f5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397313 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs" (OuterVolumeSpecName: "logs") pod "16682812-65a8-4824-a8ae-f3a0b9932ccd" (UID: "16682812-65a8-4824-a8ae-f3a0b9932ccd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397708 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0497c95b-05c5-468e-a596-4636ae376f5d-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397731 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397747 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0497c95b-05c5-468e-a596-4636ae376f5d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397759 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16682812-65a8-4824-a8ae-f3a0b9932ccd-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.397829 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts" (OuterVolumeSpecName: "scripts") pod "16682812-65a8-4824-a8ae-f3a0b9932ccd" (UID: "16682812-65a8-4824-a8ae-f3a0b9932ccd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.398277 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data" (OuterVolumeSpecName: "config-data") pod "16682812-65a8-4824-a8ae-f3a0b9932ccd" (UID: "16682812-65a8-4824-a8ae-f3a0b9932ccd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.401576 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "16682812-65a8-4824-a8ae-f3a0b9932ccd" (UID: "16682812-65a8-4824-a8ae-f3a0b9932ccd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.402158 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s" (OuterVolumeSpecName: "kube-api-access-l6w7s") pod "16682812-65a8-4824-a8ae-f3a0b9932ccd" (UID: "16682812-65a8-4824-a8ae-f3a0b9932ccd"). InnerVolumeSpecName "kube-api-access-l6w7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.402756 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf" (OuterVolumeSpecName: "kube-api-access-8fphf") pod "0497c95b-05c5-468e-a596-4636ae376f5d" (UID: "0497c95b-05c5-468e-a596-4636ae376f5d"). InnerVolumeSpecName "kube-api-access-8fphf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.403323 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0497c95b-05c5-468e-a596-4636ae376f5d" (UID: "0497c95b-05c5-468e-a596-4636ae376f5d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.405481 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg" (OuterVolumeSpecName: "kube-api-access-ndwzg") pod "334678ba-c391-4eb7-a693-37bb8dde6c26" (UID: "334678ba-c391-4eb7-a693-37bb8dde6c26"). InnerVolumeSpecName "kube-api-access-ndwzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.439703 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "334678ba-c391-4eb7-a693-37bb8dde6c26" (UID: "334678ba-c391-4eb7-a693-37bb8dde6c26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.449186 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "334678ba-c391-4eb7-a693-37bb8dde6c26" (UID: "334678ba-c391-4eb7-a693-37bb8dde6c26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.451139 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config" (OuterVolumeSpecName: "config") pod "334678ba-c391-4eb7-a693-37bb8dde6c26" (UID: "334678ba-c391-4eb7-a693-37bb8dde6c26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.455227 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "334678ba-c391-4eb7-a693-37bb8dde6c26" (UID: "334678ba-c391-4eb7-a693-37bb8dde6c26"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.472786 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9c528f-c36e-4182-9e00-1aca67f3d19c" path="/var/lib/kubelet/pods/8e9c528f-c36e-4182-9e00-1aca67f3d19c/volumes" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499709 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwzg\" (UniqueName: \"kubernetes.io/projected/334678ba-c391-4eb7-a693-37bb8dde6c26-kube-api-access-ndwzg\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499757 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499778 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499799 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6w7s\" (UniqueName: \"kubernetes.io/projected/16682812-65a8-4824-a8ae-f3a0b9932ccd-kube-api-access-l6w7s\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499817 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/16682812-65a8-4824-a8ae-f3a0b9932ccd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499833 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0497c95b-05c5-468e-a596-4636ae376f5d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499852 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499870 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499890 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fphf\" (UniqueName: \"kubernetes.io/projected/0497c95b-05c5-468e-a596-4636ae376f5d-kube-api-access-8fphf\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499907 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16682812-65a8-4824-a8ae-f3a0b9932ccd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.499925 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/334678ba-c391-4eb7-a693-37bb8dde6c26-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.543791 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.543986 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" event={"ID":"334678ba-c391-4eb7-a693-37bb8dde6c26","Type":"ContainerDied","Data":"409963b736a1973c6fc765a11a800168eda51124720dda32d86376488c3febc9"} Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.544098 4693 scope.go:117] "RemoveContainer" containerID="e0228ed7b3806a077c1d1ecb90531aeadea88cbcd2c784cb1a25ac1bf04b7ee8" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.544878 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-547d7c7775-rc6qr" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.545613 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845b6c7c9c-kdkqz" Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.544819 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-547d7c7775-rc6qr" event={"ID":"16682812-65a8-4824-a8ae-f3a0b9932ccd","Type":"ContainerDied","Data":"ba800a2d33e0804c52d37676458fd10b1d56cd887ac970316ff19cb3feac3380"} Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.545823 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845b6c7c9c-kdkqz" event={"ID":"0497c95b-05c5-468e-a596-4636ae376f5d","Type":"ContainerDied","Data":"dd4a6f8ba151234f4c321d07726e93739d7144e26fb52129dd412d3b71a2e821"} Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.572844 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.583190 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8vk68"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.604841 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.609468 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-845b6c7c9c-kdkqz"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.640578 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.647288 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-547d7c7775-rc6qr"] Dec 04 10:05:02 crc kubenswrapper[4693]: I1204 10:05:02.716489 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-8vk68" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout" Dec 04 10:05:04 crc kubenswrapper[4693]: I1204 10:05:04.473672 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0497c95b-05c5-468e-a596-4636ae376f5d" path="/var/lib/kubelet/pods/0497c95b-05c5-468e-a596-4636ae376f5d/volumes" Dec 04 10:05:04 crc kubenswrapper[4693]: I1204 10:05:04.474478 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16682812-65a8-4824-a8ae-f3a0b9932ccd" path="/var/lib/kubelet/pods/16682812-65a8-4824-a8ae-f3a0b9932ccd/volumes" Dec 04 10:05:04 crc kubenswrapper[4693]: I1204 10:05:04.474836 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" 
path="/var/lib/kubelet/pods/334678ba-c391-4eb7-a693-37bb8dde6c26/volumes" Dec 04 10:05:04 crc kubenswrapper[4693]: E1204 10:05:04.869745 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 04 10:05:04 crc kubenswrapper[4693]: E1204 10:05:04.870245 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6tmz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-p69cr_openstack(a62cb864-103c-4b89-afeb-8397af4046cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:05:04 crc kubenswrapper[4693]: E1204 10:05:04.871467 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-p69cr" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" Dec 04 10:05:05 crc kubenswrapper[4693]: E1204 10:05:05.573826 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-p69cr" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" Dec 04 10:05:05 crc kubenswrapper[4693]: E1204 10:05:05.828969 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 04 10:05:05 crc kubenswrapper[4693]: E1204 10:05:05.829150 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbh6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-q642m_openstack(832e603c-b695-442e-bcf6-fa322cfc1524): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:05:05 crc kubenswrapper[4693]: E1204 10:05:05.830352 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-q642m" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" Dec 04 10:05:05 crc kubenswrapper[4693]: I1204 10:05:05.871145 4693 scope.go:117] "RemoveContainer" containerID="65c9483e3be87bc6966efc438372ef50398e641b2efc5a5336187e8627af09a1" Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.284854 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8cd9d6cb-vf5bx"] Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.319703 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.516284 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:06 crc kubenswrapper[4693]: W1204 10:05:06.559530 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9771ef24_bcd9_44bc_8e0a_d0d135b09057.slice/crio-748c0254d9d20c3c8bcc18ae2e406d0e8b979afd1a3de23835f98192e0abfb94 WatchSource:0}: Error finding container 748c0254d9d20c3c8bcc18ae2e406d0e8b979afd1a3de23835f98192e0abfb94: Status 404 returned error can't find the container with id 748c0254d9d20c3c8bcc18ae2e406d0e8b979afd1a3de23835f98192e0abfb94 Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.582458 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srzft" event={"ID":"af7a03e1-cb13-4536-9405-791381101cdc","Type":"ContainerStarted","Data":"3b7577f95e018f83d92ce7dc59d26f9be7ad89e895504b1685de208e6d55b1c3"} Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.587499 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8cd9d6cb-vf5bx" event={"ID":"ca2592b0-5dfd-4d15-996c-2340af86bd26","Type":"ContainerStarted","Data":"d1e98e4e6cbd600a64ce8fca5a3fa0e68f96435b2ddae9655549763201d60b5f"} Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.589165 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerStarted","Data":"60f84246054c776836d04a6e91722840cd292470a9ca6050d3f9e2f19353f533"} Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.592861 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerStarted","Data":"748c0254d9d20c3c8bcc18ae2e406d0e8b979afd1a3de23835f98192e0abfb94"} Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.595815 4693 generic.go:334] "Generic (PLEG): container finished" podID="228babee-6748-4512-bd76-92168eab2e2d" containerID="bff8e3ebeba716474615ecef4839fdd6eea301daa1ae5f089ff8b5e4b1c0f79b" exitCode=0 Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.595958 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fc99g" event={"ID":"228babee-6748-4512-bd76-92168eab2e2d","Type":"ContainerDied","Data":"bff8e3ebeba716474615ecef4839fdd6eea301daa1ae5f089ff8b5e4b1c0f79b"} Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.597457 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-srzft" podStartSLOduration=6.629582988 podStartE2EDuration="41.597446106s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="2025-12-04 10:04:27.208424304 +0000 UTC m=+1313.106018057" lastFinishedPulling="2025-12-04 10:05:02.176287422 +0000 UTC m=+1348.073881175" observedRunningTime="2025-12-04 10:05:06.597190249 +0000 UTC m=+1352.494784002" watchObservedRunningTime="2025-12-04 10:05:06.597446106 +0000 UTC m=+1352.495039869" Dec 04 10:05:06 crc kubenswrapper[4693]: E1204 10:05:06.598600 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-q642m" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.683728 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:05:06 crc kubenswrapper[4693]: I1204 10:05:06.694143 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jw5j4"] Dec 04 10:05:06 crc kubenswrapper[4693]: W1204 10:05:06.702982 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4b19dd_d54c_4531_aeca_8cb22716b387.slice/crio-a9fe8f6da4de41e73d3497d7dfe5beb8e237137e5be6b9b1fb7cf2cc7688907a WatchSource:0}: Error finding container a9fe8f6da4de41e73d3497d7dfe5beb8e237137e5be6b9b1fb7cf2cc7688907a: Status 404 returned error can't find the container with id a9fe8f6da4de41e73d3497d7dfe5beb8e237137e5be6b9b1fb7cf2cc7688907a Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.290083 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:07 crc kubenswrapper[4693]: W1204 10:05:07.300211 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0279d22f_676e_45a1_b5d9_7dd7d8efc4b7.slice/crio-0b16648783dd4fd58ad0b01954e5b3d23334f51dde8f2a5fa5ff493e19bd1b6f WatchSource:0}: Error finding container 0b16648783dd4fd58ad0b01954e5b3d23334f51dde8f2a5fa5ff493e19bd1b6f: Status 404 returned error can't find the container with id 0b16648783dd4fd58ad0b01954e5b3d23334f51dde8f2a5fa5ff493e19bd1b6f Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.637572 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerStarted","Data":"abbc17a2851c37f8bc1cc8459cd7a0c1f319183cde18e70e698a76638d199c1c"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.639671 4693 generic.go:334] "Generic (PLEG): container finished" podID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerID="4cf63db4672c83617b4e662fac1194a6080551badb627c3941b766b561f0714e" exitCode=0 Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.639722 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" event={"ID":"ca4b19dd-d54c-4531-aeca-8cb22716b387","Type":"ContainerDied","Data":"4cf63db4672c83617b4e662fac1194a6080551badb627c3941b766b561f0714e"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.639740 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" event={"ID":"ca4b19dd-d54c-4531-aeca-8cb22716b387","Type":"ContainerStarted","Data":"a9fe8f6da4de41e73d3497d7dfe5beb8e237137e5be6b9b1fb7cf2cc7688907a"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.640893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerStarted","Data":"0b16648783dd4fd58ad0b01954e5b3d23334f51dde8f2a5fa5ff493e19bd1b6f"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.644645 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jw5j4" event={"ID":"00f3c238-cf53-4a99-96da-ae6118b711b4","Type":"ContainerStarted","Data":"cf6afee287e4f350cd3e182859a40e414bd245cc2f53f5a178eb6dca2920a8c6"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.644700 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jw5j4" 
event={"ID":"00f3c238-cf53-4a99-96da-ae6118b711b4","Type":"ContainerStarted","Data":"ea6fd7e226a139509f416f5e5033c4bf3741503588dfff26fda431cb2531c605"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.646708 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8cd9d6cb-vf5bx" event={"ID":"ca2592b0-5dfd-4d15-996c-2340af86bd26","Type":"ContainerStarted","Data":"f23675ee132ad009d3b6056cac2048614984627c91b61b2fada6d51aa724861b"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.646753 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8cd9d6cb-vf5bx" event={"ID":"ca2592b0-5dfd-4d15-996c-2340af86bd26","Type":"ContainerStarted","Data":"6c6e21039bf0a99aa3a86d4823dd691e24f45e744212a54690a56aa7c4d85dd2"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.654484 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerStarted","Data":"f1cd933525400a784d0d5d17a18b7c228b22829e893d0fb79dbba4b73a786655"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.654530 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerStarted","Data":"e98854c320adffc66ab9b3f27212c1efb5e2d439a4194f947da8e97aa18a8e59"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.672602 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lvx99" event={"ID":"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d","Type":"ContainerStarted","Data":"ceeddbea04eada8185ac7490c16a472d90714892ff0cec6ab7c2bb7bc1fcc931"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.686816 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerStarted","Data":"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d"} Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.688842 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f8cd9d6cb-vf5bx" podStartSLOduration=33.171796732 podStartE2EDuration="33.688822116s" podCreationTimestamp="2025-12-04 10:04:34 +0000 UTC" firstStartedPulling="2025-12-04 10:05:06.358732656 +0000 UTC m=+1352.256326409" lastFinishedPulling="2025-12-04 10:05:06.87575803 +0000 UTC m=+1352.773351793" observedRunningTime="2025-12-04 10:05:07.679234581 +0000 UTC m=+1353.576828344" watchObservedRunningTime="2025-12-04 10:05:07.688822116 +0000 UTC m=+1353.586415869" Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.717303 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7649787bc6-fddzc" podStartSLOduration=33.167932106 podStartE2EDuration="33.717286762s" podCreationTimestamp="2025-12-04 10:04:34 +0000 UTC" firstStartedPulling="2025-12-04 10:05:06.3588833 +0000 UTC m=+1352.256477053" lastFinishedPulling="2025-12-04 10:05:06.908237956 +0000 UTC m=+1352.805831709" observedRunningTime="2025-12-04 10:05:07.699721377 +0000 UTC m=+1353.597315140" watchObservedRunningTime="2025-12-04 10:05:07.717286762 +0000 UTC m=+1353.614880515" Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.736508 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jw5j4" podStartSLOduration=17.736483132 podStartE2EDuration="17.736483132s" podCreationTimestamp="2025-12-04 10:04:50 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:07.7157755 +0000 UTC m=+1353.613369253" watchObservedRunningTime="2025-12-04 10:05:07.736483132 +0000 UTC m=+1353.634076905" Dec 04 10:05:07 crc kubenswrapper[4693]: I1204 10:05:07.765572 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lvx99" podStartSLOduration=3.857367837 podStartE2EDuration="42.765549385s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="2025-12-04 10:04:27.380155595 +0000 UTC m=+1313.277749348" lastFinishedPulling="2025-12-04 10:05:06.288337143 +0000 UTC m=+1352.185930896" observedRunningTime="2025-12-04 10:05:07.731024061 +0000 UTC m=+1353.628617844" watchObservedRunningTime="2025-12-04 10:05:07.765549385 +0000 UTC m=+1353.663143138" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.172803 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fc99g" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.324880 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle\") pod \"228babee-6748-4512-bd76-92168eab2e2d\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.325080 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlpg\" (UniqueName: \"kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg\") pod \"228babee-6748-4512-bd76-92168eab2e2d\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.325157 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config\") pod \"228babee-6748-4512-bd76-92168eab2e2d\" (UID: \"228babee-6748-4512-bd76-92168eab2e2d\") " Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.337075 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg" (OuterVolumeSpecName: "kube-api-access-vjlpg") pod "228babee-6748-4512-bd76-92168eab2e2d" (UID: "228babee-6748-4512-bd76-92168eab2e2d"). InnerVolumeSpecName "kube-api-access-vjlpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.389927 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config" (OuterVolumeSpecName: "config") pod "228babee-6748-4512-bd76-92168eab2e2d" (UID: "228babee-6748-4512-bd76-92168eab2e2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.400525 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "228babee-6748-4512-bd76-92168eab2e2d" (UID: "228babee-6748-4512-bd76-92168eab2e2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.428413 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.428454 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjlpg\" (UniqueName: \"kubernetes.io/projected/228babee-6748-4512-bd76-92168eab2e2d-kube-api-access-vjlpg\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.428469 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/228babee-6748-4512-bd76-92168eab2e2d-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.705812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" event={"ID":"ca4b19dd-d54c-4531-aeca-8cb22716b387","Type":"ContainerStarted","Data":"4899be29e04db81fdf8613942d0a26341615fa4433dc5a05907485ee7e6becbf"} Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.705890 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.713216 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerStarted","Data":"acd0140f961a2e191b22c4f5afdb948bf7b6a89137979415061fdf6f5e7abcdb"} Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.727642 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerStarted","Data":"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc"} Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.727794 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-log" containerID="cri-o://d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" gracePeriod=30 Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.727981 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-httpd" containerID="cri-o://fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" gracePeriod=30 Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.740496 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fc99g" event={"ID":"228babee-6748-4512-bd76-92168eab2e2d","Type":"ContainerDied","Data":"a1dabcf3ca36ec22171d881e79f15ddebc3283b600489e4bf6b5f75cdce165a0"} Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.740613 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1dabcf3ca36ec22171d881e79f15ddebc3283b600489e4bf6b5f75cdce165a0" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.745026 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" podStartSLOduration=18.745006864 podStartE2EDuration="18.745006864s" podCreationTimestamp="2025-12-04 10:04:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:08.740989813 +0000 UTC m=+1354.638583566" watchObservedRunningTime="2025-12-04 10:05:08.745006864 +0000 UTC m=+1354.642600617" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.747306 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fc99g" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.820681 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.820664303 podStartE2EDuration="17.820664303s" podCreationTimestamp="2025-12-04 10:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:08.819968104 +0000 UTC m=+1354.717561857" watchObservedRunningTime="2025-12-04 10:05:08.820664303 +0000 UTC m=+1354.718258056" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.931250 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.964524 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:05:08 crc kubenswrapper[4693]: E1204 10:05:08.973520 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="init" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.973607 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="init" Dec 04 10:05:08 crc kubenswrapper[4693]: E1204 10:05:08.973696 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228babee-6748-4512-bd76-92168eab2e2d" containerName="neutron-db-sync" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.973755 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="228babee-6748-4512-bd76-92168eab2e2d" containerName="neutron-db-sync" Dec 04 10:05:08 crc kubenswrapper[4693]: E1204 10:05:08.973805 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.973854 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.974074 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="334678ba-c391-4eb7-a693-37bb8dde6c26" containerName="dnsmasq-dns" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.974144 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="228babee-6748-4512-bd76-92168eab2e2d" containerName="neutron-db-sync" Dec 04 10:05:08 crc kubenswrapper[4693]: I1204 10:05:08.975156 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.006653 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.052569 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.070246 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.073076 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.073227 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.073738 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.076491 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.078755 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vbdj5" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177631 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177699 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9smvv\" (UniqueName: \"kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177738 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54c66\" (UniqueName: \"kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177764 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177836 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177897 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177947 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.177973 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.178110 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.178180 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280237 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9smvv\" (UniqueName: \"kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280305 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54c66\" (UniqueName: \"kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280347 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280396 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280419 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280462 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280507 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280528 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280561 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280612 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.280643 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.281759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.282416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.288095 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: 
\"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.288778 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.289228 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.290098 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.290107 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.302186 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.302699 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.320003 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54c66\" (UniqueName: \"kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66\") pod \"dnsmasq-dns-6b7b667979-lsdbk\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.330672 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9smvv\" (UniqueName: \"kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv\") pod \"neutron-7ff87d85cb-w8vrw\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.477061 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.499191 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.701919 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764584 4693 generic.go:334] "Generic (PLEG): container finished" podID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerID="fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" exitCode=0 Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764613 4693 generic.go:334] "Generic (PLEG): container finished" podID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerID="d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" exitCode=143 Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerDied","Data":"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc"} Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerDied","Data":"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d"} Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764695 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9771ef24-bcd9-44bc-8e0a-d0d135b09057","Type":"ContainerDied","Data":"748c0254d9d20c3c8bcc18ae2e406d0e8b979afd1a3de23835f98192e0abfb94"} Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764709 4693 scope.go:117] "RemoveContainer" containerID="fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.764826 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.771769 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-log" containerID="cri-o://acd0140f961a2e191b22c4f5afdb948bf7b6a89137979415061fdf6f5e7abcdb" gracePeriod=30 Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.771857 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerStarted","Data":"1efa651f3ce77f023ada267a0cb6fe5262ec5461f91068402157c66635780e1f"} Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.771867 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-httpd" containerID="cri-o://1efa651f3ce77f023ada267a0cb6fe5262ec5461f91068402157c66635780e1f" gracePeriod=30 Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.802634 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.802601342 podStartE2EDuration="19.802601342s" podCreationTimestamp="2025-12-04 10:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:09.791748922 +0000 UTC m=+1355.689342685" watchObservedRunningTime="2025-12-04 10:05:09.802601342 +0000 UTC m=+1355.700195095" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.817180 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.817261 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.817315 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdfzj\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.817369 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.817405 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.818907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.821809 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.821883 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.821977 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data\") pod \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\" (UID: \"9771ef24-bcd9-44bc-8e0a-d0d135b09057\") " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.822833 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs" (OuterVolumeSpecName: "logs") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.823436 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.823454 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9771ef24-bcd9-44bc-8e0a-d0d135b09057-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.830713 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts" (OuterVolumeSpecName: "scripts") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.830773 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph" (OuterVolumeSpecName: "ceph") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.830772 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.864058 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj" (OuterVolumeSpecName: "kube-api-access-kdfzj") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "kube-api-access-kdfzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.864200 4693 scope.go:117] "RemoveContainer" containerID="d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.870029 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.915224 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data" (OuterVolumeSpecName: "config-data") pod "9771ef24-bcd9-44bc-8e0a-d0d135b09057" (UID: "9771ef24-bcd9-44bc-8e0a-d0d135b09057"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.924896 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.925124 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.925133 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.925155 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.925165 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdfzj\" (UniqueName: \"kubernetes.io/projected/9771ef24-bcd9-44bc-8e0a-d0d135b09057-kube-api-access-kdfzj\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.925175 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9771ef24-bcd9-44bc-8e0a-d0d135b09057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.960788 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.983863 4693 scope.go:117] "RemoveContainer" containerID="fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" Dec 
04 10:05:09 crc kubenswrapper[4693]: E1204 10:05:09.987939 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc\": container with ID starting with fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc not found: ID does not exist" containerID="fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.987986 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc"} err="failed to get container status \"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc\": rpc error: code = NotFound desc = could not find container \"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc\": container with ID starting with fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc not found: ID does not exist" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.988014 4693 scope.go:117] "RemoveContainer" containerID="d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" Dec 04 10:05:09 crc kubenswrapper[4693]: E1204 10:05:09.988510 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d\": container with ID starting with d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d not found: ID does not exist" containerID="d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.988530 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d"} err="failed to get container status \"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d\": rpc error: code = NotFound desc = could not find container \"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d\": container with ID starting with d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d not found: ID does not exist" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.988543 4693 scope.go:117] "RemoveContainer" containerID="fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.988768 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc"} err="failed to get container status \"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc\": rpc error: code = NotFound desc = could not find container \"fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc\": container with ID starting with fa1b7e5a2bd0c08b6e9876e364ac5e796def40e1de549990f062e11a3efe80cc not found: ID does not exist" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.988787 4693 scope.go:117] "RemoveContainer" containerID="d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d" Dec 04 10:05:09 crc kubenswrapper[4693]: I1204 10:05:09.989046 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d"} err="failed to get container status 
\"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d\": rpc error: code = NotFound desc = could not find container \"d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d\": container with ID starting with d0451dd7a0a5054ca4255b210f8647b30c55b6cea2c437b7ad2c00b8515ff64d not found: ID does not exist" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.027045 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.104505 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.118393 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.132543 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:10 crc kubenswrapper[4693]: E1204 10:05:10.133252 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-log" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.133280 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-log" Dec 04 10:05:10 crc kubenswrapper[4693]: E1204 10:05:10.133347 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-httpd" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.133360 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-httpd" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.133615 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-httpd" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.133651 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" containerName="glance-log" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.135010 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.138278 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.142015 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.144274 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.308178 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331546 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331601 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzjq\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331687 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331712 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331752 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331798 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331823 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.331909 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434751 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434822 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434857 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434876 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzjq\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434940 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.434964 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.435005 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 
10:05:10.435044 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.435064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.436154 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.436536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.437297 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.439219 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.439804 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.440577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.441038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.450948 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.455124 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzjq\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.481955 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.481962 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9771ef24-bcd9-44bc-8e0a-d0d135b09057" path="/var/lib/kubelet/pods/9771ef24-bcd9-44bc-8e0a-d0d135b09057/volumes" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.496527 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.783862 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" containerID="cri-o://4899be29e04db81fdf8613942d0a26341615fa4433dc5a05907485ee7e6becbf" gracePeriod=10 Dec 04 10:05:10 crc kubenswrapper[4693]: I1204 10:05:10.784270 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerStarted","Data":"17baf77493703ae7a1fdb96be17fac4d139015d4f5b986ff8faa81b8bda7b5d6"} Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.036228 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:05:11 crc kubenswrapper[4693]: W1204 10:05:11.059481 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c2dadd4_e719_4ec8_915e_683db6276f04.slice/crio-922ec2c7f93c103088ce3cd5934ffef4268f2ef2ff72a70109b5e4b95f7d879e WatchSource:0}: Error finding container 922ec2c7f93c103088ce3cd5934ffef4268f2ef2ff72a70109b5e4b95f7d879e: Status 404 returned error can't find the container with id 922ec2c7f93c103088ce3cd5934ffef4268f2ef2ff72a70109b5e4b95f7d879e Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.288567 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77d49c9649-fpwft"] Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.297501 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.303014 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.306861 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.330546 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d49c9649-fpwft"] Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.458446 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-internal-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.458539 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-ovndb-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.459109 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.459152 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-public-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.459362 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-combined-ca-bundle\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.459454 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-httpd-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.459501 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtvp\" (UniqueName: \"kubernetes.io/projected/c2c81aab-5f08-429a-941e-9890ef46273e-kube-api-access-wqtvp\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561759 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-combined-ca-bundle\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561824 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-httpd-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561847 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtvp\" (UniqueName: \"kubernetes.io/projected/c2c81aab-5f08-429a-941e-9890ef46273e-kube-api-access-wqtvp\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561904 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-internal-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561945 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-ovndb-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.561982 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.562000 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-public-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.567913 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-public-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.568037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-combined-ca-bundle\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.568382 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-internal-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: 
\"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.573016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.574080 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-ovndb-tls-certs\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.581675 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c81aab-5f08-429a-941e-9890ef46273e-httpd-config\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.585809 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtvp\" (UniqueName: \"kubernetes.io/projected/c2c81aab-5f08-429a-941e-9890ef46273e-kube-api-access-wqtvp\") pod \"neutron-77d49c9649-fpwft\" (UID: \"c2c81aab-5f08-429a-941e-9890ef46273e\") " pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.619707 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.803695 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerStarted","Data":"922ec2c7f93c103088ce3cd5934ffef4268f2ef2ff72a70109b5e4b95f7d879e"} Dec 04 10:05:11 crc kubenswrapper[4693]: I1204 10:05:11.898397 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:05:11 crc kubenswrapper[4693]: W1204 10:05:11.902080 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c1e1a17_c252_42be_af6b_d6a3d0cacf8c.slice/crio-849236c820a9a1307492e7599e96d082b828cc85c5b38a7ed042754a0b8a5fc5 WatchSource:0}: Error finding container 849236c820a9a1307492e7599e96d082b828cc85c5b38a7ed042754a0b8a5fc5: Status 404 returned error can't find the container with id 849236c820a9a1307492e7599e96d082b828cc85c5b38a7ed042754a0b8a5fc5 Dec 04 10:05:12 crc kubenswrapper[4693]: I1204 10:05:12.156724 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77d49c9649-fpwft"] Dec 04 10:05:12 crc kubenswrapper[4693]: W1204 10:05:12.170785 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c81aab_5f08_429a_941e_9890ef46273e.slice/crio-1966e33c6a5d703c8e62c1e47a2d1ac85feb3a1bbb9fc5881ffd7741800909b9 WatchSource:0}: Error finding container 1966e33c6a5d703c8e62c1e47a2d1ac85feb3a1bbb9fc5881ffd7741800909b9: Status 404 returned error can't find the container with id 1966e33c6a5d703c8e62c1e47a2d1ac85feb3a1bbb9fc5881ffd7741800909b9 Dec 04 10:05:12 crc kubenswrapper[4693]: I1204 10:05:12.815434 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d49c9649-fpwft" event={"ID":"c2c81aab-5f08-429a-941e-9890ef46273e","Type":"ContainerStarted","Data":"1966e33c6a5d703c8e62c1e47a2d1ac85feb3a1bbb9fc5881ffd7741800909b9"} Dec 04 10:05:12 crc kubenswrapper[4693]: I1204 10:05:12.817274 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerStarted","Data":"849236c820a9a1307492e7599e96d082b828cc85c5b38a7ed042754a0b8a5fc5"} Dec 04 10:05:12 crc kubenswrapper[4693]: I1204 10:05:12.819933 4693 generic.go:334] "Generic (PLEG): container finished" podID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerID="acd0140f961a2e191b22c4f5afdb948bf7b6a89137979415061fdf6f5e7abcdb" exitCode=143 Dec 04 10:05:12 crc kubenswrapper[4693]: I1204 10:05:12.819974 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerDied","Data":"acd0140f961a2e191b22c4f5afdb948bf7b6a89137979415061fdf6f5e7abcdb"} Dec 04 10:05:14 crc kubenswrapper[4693]: I1204 10:05:14.365991 4693 generic.go:334] "Generic (PLEG): container finished" podID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerID="1efa651f3ce77f023ada267a0cb6fe5262ec5461f91068402157c66635780e1f" exitCode=0 Dec 04 10:05:14 crc kubenswrapper[4693]: I1204 10:05:14.366103 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerDied","Data":"1efa651f3ce77f023ada267a0cb6fe5262ec5461f91068402157c66635780e1f"} Dec 04 10:05:14 crc kubenswrapper[4693]: I1204 10:05:14.864032 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:05:14 crc kubenswrapper[4693]: I1204 10:05:14.864530 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.047917 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.047972 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.375279 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerStarted","Data":"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e"} Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.376751 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerStarted","Data":"f29775d6004cfb3d5fe9f692378a399855b2c4276628128cb7b5fd2a2e2d9910"} Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.378149 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerStarted","Data":"bcf30f7573081bc537a4fb717a7daba2a42b91c9352ca3c8498d7b1be9ff6f2b"} Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.379693 4693 generic.go:334] "Generic (PLEG): container finished" podID="ca4b19dd-d54c-4531-aeca-8cb22716b387" 
containerID="4899be29e04db81fdf8613942d0a26341615fa4433dc5a05907485ee7e6becbf" exitCode=0 Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.379746 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" event={"ID":"ca4b19dd-d54c-4531-aeca-8cb22716b387","Type":"ContainerDied","Data":"4899be29e04db81fdf8613942d0a26341615fa4433dc5a05907485ee7e6becbf"} Dec 04 10:05:15 crc kubenswrapper[4693]: I1204 10:05:15.381661 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d49c9649-fpwft" event={"ID":"c2c81aab-5f08-429a-941e-9890ef46273e","Type":"ContainerStarted","Data":"e11edafffc095c1244fd4bc700e23415764654bc2bc3afcb5b5ff1d716a7545d"} Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.201546 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.394979 4693 generic.go:334] "Generic (PLEG): container finished" podID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerID="bcf30f7573081bc537a4fb717a7daba2a42b91c9352ca3c8498d7b1be9ff6f2b" exitCode=0 Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.395021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerDied","Data":"bcf30f7573081bc537a4fb717a7daba2a42b91c9352ca3c8498d7b1be9ff6f2b"} Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.819659 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.969885 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970199 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970248 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970401 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970431 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970446 4693 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970501 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970562 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle\") pod \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\" (UID: \"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7\") " Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970780 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.970809 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs" (OuterVolumeSpecName: "logs") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.975822 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph" (OuterVolumeSpecName: "ceph") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.976367 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp" (OuterVolumeSpecName: "kube-api-access-k5nfp") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "kube-api-access-k5nfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.976387 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.976888 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts" (OuterVolumeSpecName: "scripts") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:16 crc kubenswrapper[4693]: I1204 10:05:16.999392 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.022795 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data" (OuterVolumeSpecName: "config-data") pod "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" (UID: "0279d22f-676e-45a1-b5d9-7dd7d8efc4b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073281 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073320 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073349 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073383 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073415 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073430 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-kube-api-access-k5nfp\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073445 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.073457 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.098728 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.174842 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.416771 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0279d22f-676e-45a1-b5d9-7dd7d8efc4b7","Type":"ContainerDied","Data":"0b16648783dd4fd58ad0b01954e5b3d23334f51dde8f2a5fa5ff493e19bd1b6f"} Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.416820 4693 scope.go:117] "RemoveContainer" containerID="1efa651f3ce77f023ada267a0cb6fe5262ec5461f91068402157c66635780e1f" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.416844 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.461752 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.481127 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.494043 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:17 crc kubenswrapper[4693]: E1204 10:05:17.494489 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-httpd" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.494524 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-httpd" Dec 04 10:05:17 crc kubenswrapper[4693]: E1204 10:05:17.494537 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-log" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.494545 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-log" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.494758 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-log" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.494781 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" containerName="glance-httpd" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.495782 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.503162 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.503449 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.518387 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.685961 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686286 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686387 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686719 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lncd\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686801 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686916 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.686970 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.687072 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.788695 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.788747 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.788795 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.788814 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.788897 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.789448 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.789758 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.789887 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lncd\" (UniqueName: 
\"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.789978 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.790174 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.790284 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.790681 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.794600 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.795035 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.795087 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.795984 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.798360 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.808838 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lncd\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:17 crc kubenswrapper[4693]: I1204 10:05:17.830070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " pod="openstack/glance-default-external-api-0" Dec 04 10:05:18 crc kubenswrapper[4693]: I1204 10:05:18.118228 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:05:18 crc kubenswrapper[4693]: I1204 10:05:18.154646 4693 scope.go:117] "RemoveContainer" containerID="acd0140f961a2e191b22c4f5afdb948bf7b6a89137979415061fdf6f5e7abcdb" Dec 04 10:05:18 crc kubenswrapper[4693]: I1204 10:05:18.470684 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0279d22f-676e-45a1-b5d9-7dd7d8efc4b7" path="/var/lib/kubelet/pods/0279d22f-676e-45a1-b5d9-7dd7d8efc4b7/volumes" Dec 04 10:05:19 crc kubenswrapper[4693]: I1204 10:05:19.093938 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:05:19 crc kubenswrapper[4693]: I1204 10:05:19.436107 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerStarted","Data":"25e768c65269c732647a35196ac631ecb5a8c56bf94bd2ff253d434c5544ff08"} Dec 04 10:05:20 crc kubenswrapper[4693]: I1204 10:05:20.452680 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerStarted","Data":"4392f0b7049337839504d250e6ebc64937035f7589a6309cf39ca4378181567c"} Dec 04 10:05:21 crc kubenswrapper[4693]: I1204 10:05:21.462430 4693 generic.go:334] "Generic (PLEG): container finished" podID="af7a03e1-cb13-4536-9405-791381101cdc" containerID="3b7577f95e018f83d92ce7dc59d26f9be7ad89e895504b1685de208e6d55b1c3" exitCode=0 Dec 04 10:05:21 crc kubenswrapper[4693]: I1204 10:05:21.462522 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srzft" event={"ID":"af7a03e1-cb13-4536-9405-791381101cdc","Type":"ContainerDied","Data":"3b7577f95e018f83d92ce7dc59d26f9be7ad89e895504b1685de208e6d55b1c3"} Dec 04 10:05:23 crc kubenswrapper[4693]: I1204 10:05:23.482781 4693 generic.go:334] "Generic (PLEG): container finished" podID="00f3c238-cf53-4a99-96da-ae6118b711b4" containerID="cf6afee287e4f350cd3e182859a40e414bd245cc2f53f5a178eb6dca2920a8c6" exitCode=0 Dec 04 10:05:23 crc kubenswrapper[4693]: I1204 10:05:23.482967 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jw5j4" event={"ID":"00f3c238-cf53-4a99-96da-ae6118b711b4","Type":"ContainerDied","Data":"cf6afee287e4f350cd3e182859a40e414bd245cc2f53f5a178eb6dca2920a8c6"} Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.252428 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srzft" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.254612 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs\") pod \"af7a03e1-cb13-4536-9405-791381101cdc\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.254649 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n9qq\" (UniqueName: \"kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq\") pod \"af7a03e1-cb13-4536-9405-791381101cdc\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.254729 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle\") pod \"af7a03e1-cb13-4536-9405-791381101cdc\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.254790 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data\") pod \"af7a03e1-cb13-4536-9405-791381101cdc\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.254855 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts\") pod \"af7a03e1-cb13-4536-9405-791381101cdc\" (UID: \"af7a03e1-cb13-4536-9405-791381101cdc\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.256050 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs" (OuterVolumeSpecName: "logs") pod "af7a03e1-cb13-4536-9405-791381101cdc" (UID: "af7a03e1-cb13-4536-9405-791381101cdc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.263128 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.281426 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq" (OuterVolumeSpecName: "kube-api-access-5n9qq") pod "af7a03e1-cb13-4536-9405-791381101cdc" (UID: "af7a03e1-cb13-4536-9405-791381101cdc"). InnerVolumeSpecName "kube-api-access-5n9qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.281555 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts" (OuterVolumeSpecName: "scripts") pod "af7a03e1-cb13-4536-9405-791381101cdc" (UID: "af7a03e1-cb13-4536-9405-791381101cdc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.313785 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af7a03e1-cb13-4536-9405-791381101cdc" (UID: "af7a03e1-cb13-4536-9405-791381101cdc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.330846 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data" (OuterVolumeSpecName: "config-data") pod "af7a03e1-cb13-4536-9405-791381101cdc" (UID: "af7a03e1-cb13-4536-9405-791381101cdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.356777 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.356849 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.356882 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.357003 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.357127 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt84q\" (UniqueName: \"kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.357281 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb\") pod \"ca4b19dd-d54c-4531-aeca-8cb22716b387\" (UID: \"ca4b19dd-d54c-4531-aeca-8cb22716b387\") " Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.365650 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q" (OuterVolumeSpecName: "kube-api-access-jt84q") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "kube-api-access-jt84q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.373964 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.374275 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a03e1-cb13-4536-9405-791381101cdc-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.374293 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n9qq\" (UniqueName: \"kubernetes.io/projected/af7a03e1-cb13-4536-9405-791381101cdc-kube-api-access-5n9qq\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.374317 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.374343 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a03e1-cb13-4536-9405-791381101cdc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.401602 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.438895 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.446045 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.447066 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.449074 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config" (OuterVolumeSpecName: "config") pod "ca4b19dd-d54c-4531-aeca-8cb22716b387" (UID: "ca4b19dd-d54c-4531-aeca-8cb22716b387"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476753 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476781 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476790 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476798 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476808 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca4b19dd-d54c-4531-aeca-8cb22716b387-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.476817 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt84q\" (UniqueName: \"kubernetes.io/projected/ca4b19dd-d54c-4531-aeca-8cb22716b387-kube-api-access-jt84q\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.503911 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" event={"ID":"ca4b19dd-d54c-4531-aeca-8cb22716b387","Type":"ContainerDied","Data":"a9fe8f6da4de41e73d3497d7dfe5beb8e237137e5be6b9b1fb7cf2cc7688907a"} Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.504314 4693 scope.go:117] "RemoveContainer" containerID="4899be29e04db81fdf8613942d0a26341615fa4433dc5a05907485ee7e6becbf" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.503956 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.509603 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srzft" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.509612 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srzft" event={"ID":"af7a03e1-cb13-4536-9405-791381101cdc","Type":"ContainerDied","Data":"5d42c355992d6deb77facf0f2e868daece9b78cd6c1c706a85c4823fa0b63cc1"} Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.509648 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d42c355992d6deb77facf0f2e868daece9b78cd6c1c706a85c4823fa0b63cc1" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.558693 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.564020 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-9t76h"] Dec 04 10:05:24 crc kubenswrapper[4693]: E1204 10:05:24.643436 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf7a03e1_cb13_4536_9405_791381101cdc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf7a03e1_cb13_4536_9405_791381101cdc.slice/crio-5d42c355992d6deb77facf0f2e868daece9b78cd6c1c706a85c4823fa0b63cc1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4b19dd_d54c_4531_aeca_8cb22716b387.slice\": RecentStats: unable to find data in memory cache]" Dec 04 10:05:24 crc kubenswrapper[4693]: I1204 10:05:24.864511 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7649787bc6-fddzc" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.048510 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f8cd9d6cb-vf5bx" podUID="ca2592b0-5dfd-4d15-996c-2340af86bd26" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.388475 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-596dc75986-wjgrk"] Dec 04 10:05:25 crc kubenswrapper[4693]: E1204 10:05:25.388968 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.388984 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" Dec 04 10:05:25 crc kubenswrapper[4693]: E1204 10:05:25.388998 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7a03e1-cb13-4536-9405-791381101cdc" containerName="placement-db-sync" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.389006 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7a03e1-cb13-4536-9405-791381101cdc" containerName="placement-db-sync" Dec 04 10:05:25 crc kubenswrapper[4693]: E1204 10:05:25.389018 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="init" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.389024 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="init" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.389238 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7a03e1-cb13-4536-9405-791381101cdc" containerName="placement-db-sync" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.389255 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.390448 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.393440 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.393695 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.393738 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.395551 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.399849 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k954f" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.413473 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-596dc75986-wjgrk"] Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492054 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-combined-ca-bundle\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492114 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e6c8844-46bb-47e2-99d2-a9da861757e7-logs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc7t7\" (UniqueName: \"kubernetes.io/projected/7e6c8844-46bb-47e2-99d2-a9da861757e7-kube-api-access-cc7t7\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-internal-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492503 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-public-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492588 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-scripts\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.492687 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-config-data\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.594544 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-public-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.594613 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-scripts\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.594669 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-config-data\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.594746 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-combined-ca-bundle\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.594773 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e6c8844-46bb-47e2-99d2-a9da861757e7-logs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.595408 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e6c8844-46bb-47e2-99d2-a9da861757e7-logs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.596077 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc7t7\" (UniqueName: 
\"kubernetes.io/projected/7e6c8844-46bb-47e2-99d2-a9da861757e7-kube-api-access-cc7t7\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.596129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-internal-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.601409 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-combined-ca-bundle\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.601449 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-config-data\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.601750 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-public-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.603419 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-scripts\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.604985 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6c8844-46bb-47e2-99d2-a9da861757e7-internal-tls-certs\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.620708 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc7t7\" (UniqueName: \"kubernetes.io/projected/7e6c8844-46bb-47e2-99d2-a9da861757e7-kube-api-access-cc7t7\") pod \"placement-596dc75986-wjgrk\" (UID: \"7e6c8844-46bb-47e2-99d2-a9da861757e7\") " pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:25 crc kubenswrapper[4693]: I1204 10:05:25.705003 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.198845 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-9t76h" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.471178 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4b19dd-d54c-4531-aeca-8cb22716b387" path="/var/lib/kubelet/pods/ca4b19dd-d54c-4531-aeca-8cb22716b387/volumes" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.628687 4693 scope.go:117] "RemoveContainer" containerID="4cf63db4672c83617b4e662fac1194a6080551badb627c3941b766b561f0714e" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.681744 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.713976 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.714034 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.714109 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.714158 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.714263 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.714394 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mbzx\" (UniqueName: \"kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx\") pod \"00f3c238-cf53-4a99-96da-ae6118b711b4\" (UID: \"00f3c238-cf53-4a99-96da-ae6118b711b4\") " Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.724699 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.732429 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts" (OuterVolumeSpecName: "scripts") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.732627 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx" (OuterVolumeSpecName: "kube-api-access-4mbzx") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "kube-api-access-4mbzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.733193 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.785414 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.803907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data" (OuterVolumeSpecName: "config-data") pod "00f3c238-cf53-4a99-96da-ae6118b711b4" (UID: "00f3c238-cf53-4a99-96da-ae6118b711b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816498 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mbzx\" (UniqueName: \"kubernetes.io/projected/00f3c238-cf53-4a99-96da-ae6118b711b4-kube-api-access-4mbzx\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816531 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816543 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816552 4693 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816565 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:26 crc kubenswrapper[4693]: I1204 10:05:26.816573 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00f3c238-cf53-4a99-96da-ae6118b711b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:27 crc kubenswrapper[4693]: I1204 10:05:27.549062 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerStarted","Data":"ac3055f045b3812d37fde41974980bdadf2b3b77f8ee94f92d3619e2149fcaf0"} Dec 04 10:05:27 crc kubenswrapper[4693]: I1204 10:05:27.551882 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jw5j4" event={"ID":"00f3c238-cf53-4a99-96da-ae6118b711b4","Type":"ContainerDied","Data":"ea6fd7e226a139509f416f5e5033c4bf3741503588dfff26fda431cb2531c605"} Dec 04 10:05:27 crc kubenswrapper[4693]: I1204 10:05:27.551926 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea6fd7e226a139509f416f5e5033c4bf3741503588dfff26fda431cb2531c605" Dec 04 10:05:27 crc kubenswrapper[4693]: I1204 10:05:27.551964 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jw5j4" Dec 04 10:05:27 crc kubenswrapper[4693]: I1204 10:05:27.592404 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.59238487 podStartE2EDuration="17.59238487s" podCreationTimestamp="2025-12-04 10:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:27.580985465 +0000 UTC m=+1373.478579218" watchObservedRunningTime="2025-12-04 10:05:27.59238487 +0000 UTC m=+1373.489978623" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.178836 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f774bdc67-hjxts"] Dec 04 10:05:30 crc kubenswrapper[4693]: E1204 10:05:30.179915 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f3c238-cf53-4a99-96da-ae6118b711b4" containerName="keystone-bootstrap" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.179942 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f3c238-cf53-4a99-96da-ae6118b711b4" containerName="keystone-bootstrap" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.180229 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f3c238-cf53-4a99-96da-ae6118b711b4" containerName="keystone-bootstrap" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.181185 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.184848 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.184991 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.185419 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9fq9s" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.185620 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.185658 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.187207 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191542 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-public-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191601 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f774bdc67-hjxts"] Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191707 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-config-data\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 
crc kubenswrapper[4693]: I1204 10:05:30.191740 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-internal-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191770 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mn2f\" (UniqueName: \"kubernetes.io/projected/fc3b6747-ed65-46ef-8034-e35edf80ac90-kube-api-access-7mn2f\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191875 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-combined-ca-bundle\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191908 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-credential-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.191944 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-fernet-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.192157 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-scripts\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293550 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-config-data\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293593 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-internal-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293616 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mn2f\" (UniqueName: \"kubernetes.io/projected/fc3b6747-ed65-46ef-8034-e35edf80ac90-kube-api-access-7mn2f\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" 
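The keystone and placement entries above all follow kubelet's klog line layout: a journald prefix ("Dec 04 10:05:30 crc kubenswrapper[4693]:"), a severity letter fused to an MMDD date ("I1204"), a wall-clock timestamp, the logging PID, the source file and line ("reconciler_common.go:245]"), and finally a quoted message with key="value" fields. As a minimal sketch only (not taken from kubelet or from any existing tool), the Go program below pulls those fields out of such a journal and surfaces error-level and "Probe failed" entries; it assumes one journal entry per line, as journalctl normally emits them (this transcript runs several entries together), and the field layout is inferred solely from the entries shown in this log.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

// klogLine captures the fields visible in the entries above: severity letter,
// MMDD date, wall-clock time, source file:line, and the message tail.
var klogLine = regexp.MustCompile(
	`kubenswrapper\[\d+\]: ([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+([\w.]+:\d+)\] (.*)$`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	// Single entries can be very long (see the container spec dumped in the
	// "Unhandled Error" entry further down), so raise the scanner's line limit.
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		m := klogLine.FindStringSubmatch(sc.Text())
		if m == nil {
			continue // not a kubenswrapper/klog entry
		}
		sev, date, ts, src, msg := m[1], m[2], m[3], m[4], m[5]
		// Surface only warnings, errors, and probe failures -- the entries most worth triaging.
		if sev == "E" || sev == "W" || strings.Contains(msg, `"Probe failed"`) {
			fmt.Printf("%s %s %s %-30s %s\n", sev, date, ts, src, msg)
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
	}
}

Run against a per-line copy of this journal (for example, go run triage.go < kubelet.log, with triage.go and kubelet.log as placeholder names), it would surface the error-level entries and the horizon, dnsmasq and neutron probe failures recorded in this section while skipping the routine volume-reconciler traffic.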
Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293641 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-combined-ca-bundle\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293673 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-credential-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.293742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-fernet-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.294562 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-scripts\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.294919 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-public-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.300467 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-public-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.300986 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-internal-tls-certs\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.301327 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-combined-ca-bundle\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.302000 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-fernet-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.303602 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-config-data\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.312697 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-scripts\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.312962 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc3b6747-ed65-46ef-8034-e35edf80ac90-credential-keys\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.323807 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mn2f\" (UniqueName: \"kubernetes.io/projected/fc3b6747-ed65-46ef-8034-e35edf80ac90-kube-api-access-7mn2f\") pod \"keystone-7f774bdc67-hjxts\" (UID: \"fc3b6747-ed65-46ef-8034-e35edf80ac90\") " pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.497833 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.497880 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.533459 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.536702 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.557681 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.607000 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:30 crc kubenswrapper[4693]: I1204 10:05:30.607048 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.616821 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77d49c9649-fpwft" event={"ID":"c2c81aab-5f08-429a-941e-9890ef46273e","Type":"ContainerStarted","Data":"0349683c1cfd9594d370b9706ec3e00ac358917827df135de127cdf082bda185"} Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.618220 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.620793 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerStarted","Data":"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5"} Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.621172 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.646367 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77d49c9649-fpwft" podStartSLOduration=20.646346846 podStartE2EDuration="20.646346846s" podCreationTimestamp="2025-12-04 10:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:31.63960026 +0000 UTC m=+1377.537194023" watchObservedRunningTime="2025-12-04 10:05:31.646346846 +0000 UTC m=+1377.543940619" Dec 04 10:05:31 crc kubenswrapper[4693]: I1204 10:05:31.662713 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7ff87d85cb-w8vrw" podStartSLOduration=22.662691957 podStartE2EDuration="22.662691957s" podCreationTimestamp="2025-12-04 10:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:31.660807825 +0000 UTC m=+1377.558401578" watchObservedRunningTime="2025-12-04 10:05:31.662691957 +0000 UTC m=+1377.560285710" Dec 04 10:05:32 crc kubenswrapper[4693]: I1204 10:05:32.669184 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:34 crc kubenswrapper[4693]: I1204 10:05:34.864184 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7649787bc6-fddzc" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 04 10:05:36 crc kubenswrapper[4693]: I1204 10:05:36.934940 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:05:38 crc kubenswrapper[4693]: I1204 10:05:38.690246 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f8cd9d6cb-vf5bx" Dec 04 10:05:38 crc kubenswrapper[4693]: I1204 10:05:38.759665 4693 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:05:38 crc kubenswrapper[4693]: I1204 10:05:38.760211 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7649787bc6-fddzc" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon-log" containerID="cri-o://e98854c320adffc66ab9b3f27212c1efb5e2d439a4194f947da8e97aa18a8e59" gracePeriod=30 Dec 04 10:05:38 crc kubenswrapper[4693]: I1204 10:05:38.760257 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7649787bc6-fddzc" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" containerID="cri-o://f1cd933525400a784d0d5d17a18b7c228b22829e893d0fb79dbba4b73a786655" gracePeriod=30 Dec 04 10:05:39 crc kubenswrapper[4693]: I1204 10:05:39.504047 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 10:05:39 crc kubenswrapper[4693]: I1204 10:05:39.504080 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 10:05:39 crc kubenswrapper[4693]: I1204 10:05:39.504539 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.491045 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-596dc75986-wjgrk"] Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.595897 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f774bdc67-hjxts"] Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.733618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-596dc75986-wjgrk" event={"ID":"7e6c8844-46bb-47e2-99d2-a9da861757e7","Type":"ContainerStarted","Data":"7ea0556add8a9e74ef77da94ccfdc8e893c512ec5fe38449c7f17d06bedaf431"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.748180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f774bdc67-hjxts" event={"ID":"fc3b6747-ed65-46ef-8034-e35edf80ac90","Type":"ContainerStarted","Data":"9d0e079cd2fee725404c23b2b1713d1f283b66d53ee441b611538771ec4af182"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.751049 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerStarted","Data":"4c5e19c3b9d3c1e1b8b40d70d16b050682e30ec966883be7ebb5bf627f58c55b"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.777506 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerStarted","Data":"03b0ae154874e27f4fd270f8d73e693e94e8793a55b2936871d9df8e22d541fb"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.785449 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" 
event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerStarted","Data":"c911e4ffb75a2ebfcbdfc785b7a48cc5605208fd6d0a663239513b8b5cb58d89"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.787182 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.792558 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.792532533 podStartE2EDuration="24.792532533s" podCreationTimestamp="2025-12-04 10:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:41.785156729 +0000 UTC m=+1387.682750482" watchObservedRunningTime="2025-12-04 10:05:41.792532533 +0000 UTC m=+1387.690126286" Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.805361 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p69cr" event={"ID":"a62cb864-103c-4b89-afeb-8397af4046cb","Type":"ContainerStarted","Data":"714f5dd6229837b0c880729a145472584d9396fcefc23b643b9687974c4c9ffe"} Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.817358 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" podStartSLOduration=33.817341008 podStartE2EDuration="33.817341008s" podCreationTimestamp="2025-12-04 10:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:41.811006273 +0000 UTC m=+1387.708600016" watchObservedRunningTime="2025-12-04 10:05:41.817341008 +0000 UTC m=+1387.714934751" Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.828303 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-p69cr" podStartSLOduration=3.202463831 podStartE2EDuration="1m16.82828386s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="2025-12-04 10:04:27.345552399 +0000 UTC m=+1313.243146152" lastFinishedPulling="2025-12-04 10:05:40.971372428 +0000 UTC m=+1386.868966181" observedRunningTime="2025-12-04 10:05:41.826298645 +0000 UTC m=+1387.723892398" watchObservedRunningTime="2025-12-04 10:05:41.82828386 +0000 UTC m=+1387.725877613" Dec 04 10:05:41 crc kubenswrapper[4693]: I1204 10:05:41.879717 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.184976 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77d49c9649-fpwft" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.289848 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.290140 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-api" containerID="cri-o://fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e" gracePeriod=30 Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.291265 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" 
containerID="cri-o://f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5" gracePeriod=30 Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.317438 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7ff87d85cb-w8vrw" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.153:9696/\": EOF" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.817533 4693 generic.go:334] "Generic (PLEG): container finished" podID="9928acda-0163-4fde-8635-f861c13c43fb" containerID="f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5" exitCode=0 Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.817591 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerDied","Data":"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5"} Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.820916 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-596dc75986-wjgrk" event={"ID":"7e6c8844-46bb-47e2-99d2-a9da861757e7","Type":"ContainerStarted","Data":"3d27c8e44ae6a4f03c547d0312db25344055c68838d1eb00a129687477e40ba7"} Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.820967 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-596dc75986-wjgrk" event={"ID":"7e6c8844-46bb-47e2-99d2-a9da861757e7","Type":"ContainerStarted","Data":"0051c55128b10868591616664345ae5e4567f3995b260ed14c2a865f04deb075"} Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.821103 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.822822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f774bdc67-hjxts" event={"ID":"fc3b6747-ed65-46ef-8034-e35edf80ac90","Type":"ContainerStarted","Data":"ea6c20ea2bae7bfe1edd0bcf0744656a92e522ec62e3faebfdf5775b33bfeaf2"} Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.822939 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.825094 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q642m" event={"ID":"832e603c-b695-442e-bcf6-fa322cfc1524","Type":"ContainerStarted","Data":"e1856c4bcc5f7d4dd767ab5d0ee7c26159e7fce3876db52d046b3e1b06a476cf"} Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.855711 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-596dc75986-wjgrk" podStartSLOduration=17.855690961 podStartE2EDuration="17.855690961s" podCreationTimestamp="2025-12-04 10:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:05:42.84661632 +0000 UTC m=+1388.744210073" watchObservedRunningTime="2025-12-04 10:05:42.855690961 +0000 UTC m=+1388.753284704" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.874715 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f774bdc67-hjxts" podStartSLOduration=12.874680245 podStartE2EDuration="12.874680245s" podCreationTimestamp="2025-12-04 10:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 10:05:42.866830179 +0000 UTC m=+1388.764423932" watchObservedRunningTime="2025-12-04 10:05:42.874680245 +0000 UTC m=+1388.772274018" Dec 04 10:05:42 crc kubenswrapper[4693]: I1204 10:05:42.897023 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-q642m" podStartSLOduration=4.020270997 podStartE2EDuration="1m17.896997612s" podCreationTimestamp="2025-12-04 10:04:25 +0000 UTC" firstStartedPulling="2025-12-04 10:04:27.105497613 +0000 UTC m=+1313.003091366" lastFinishedPulling="2025-12-04 10:05:40.982224228 +0000 UTC m=+1386.879817981" observedRunningTime="2025-12-04 10:05:42.890133782 +0000 UTC m=+1388.787727535" watchObservedRunningTime="2025-12-04 10:05:42.896997612 +0000 UTC m=+1388.794591365" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.668687 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.704838 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs\") pod \"9928acda-0163-4fde-8635-f861c13c43fb\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.705078 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config\") pod \"9928acda-0163-4fde-8635-f861c13c43fb\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.705102 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle\") pod \"9928acda-0163-4fde-8635-f861c13c43fb\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.705140 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9smvv\" (UniqueName: \"kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv\") pod \"9928acda-0163-4fde-8635-f861c13c43fb\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.705202 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config\") pod \"9928acda-0163-4fde-8635-f861c13c43fb\" (UID: \"9928acda-0163-4fde-8635-f861c13c43fb\") " Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.728614 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv" (OuterVolumeSpecName: "kube-api-access-9smvv") pod "9928acda-0163-4fde-8635-f861c13c43fb" (UID: "9928acda-0163-4fde-8635-f861c13c43fb"). InnerVolumeSpecName "kube-api-access-9smvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.734072 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9928acda-0163-4fde-8635-f861c13c43fb" (UID: "9928acda-0163-4fde-8635-f861c13c43fb"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.779490 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config" (OuterVolumeSpecName: "config") pod "9928acda-0163-4fde-8635-f861c13c43fb" (UID: "9928acda-0163-4fde-8635-f861c13c43fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.796191 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9928acda-0163-4fde-8635-f861c13c43fb" (UID: "9928acda-0163-4fde-8635-f861c13c43fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.808089 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.808122 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.808134 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9smvv\" (UniqueName: \"kubernetes.io/projected/9928acda-0163-4fde-8635-f861c13c43fb-kube-api-access-9smvv\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.808144 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.830049 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9928acda-0163-4fde-8635-f861c13c43fb" (UID: "9928acda-0163-4fde-8635-f861c13c43fb"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.846928 4693 generic.go:334] "Generic (PLEG): container finished" podID="9928acda-0163-4fde-8635-f861c13c43fb" containerID="fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e" exitCode=0 Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.847391 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerDied","Data":"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e"} Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.847438 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7ff87d85cb-w8vrw" event={"ID":"9928acda-0163-4fde-8635-f861c13c43fb","Type":"ContainerDied","Data":"17baf77493703ae7a1fdb96be17fac4d139015d4f5b986ff8faa81b8bda7b5d6"} Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.847459 4693 scope.go:117] "RemoveContainer" containerID="f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.847515 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7ff87d85cb-w8vrw" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.848991 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.894605 4693 scope.go:117] "RemoveContainer" containerID="fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.895059 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.910138 4693 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9928acda-0163-4fde-8635-f861c13c43fb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.914652 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7ff87d85cb-w8vrw"] Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.922107 4693 scope.go:117] "RemoveContainer" containerID="f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5" Dec 04 10:05:43 crc kubenswrapper[4693]: E1204 10:05:43.923078 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5\": container with ID starting with f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5 not found: ID does not exist" containerID="f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.923125 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5"} err="failed to get container status \"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5\": rpc error: code = NotFound desc = could not find container \"f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5\": container with ID starting with f88f292625dc3426b9568c228f1164544454fdea7d1df1e22de7eac21782eed5 not found: ID does not exist" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.923157 4693 scope.go:117] 
"RemoveContainer" containerID="fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e" Dec 04 10:05:43 crc kubenswrapper[4693]: E1204 10:05:43.923650 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e\": container with ID starting with fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e not found: ID does not exist" containerID="fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e" Dec 04 10:05:43 crc kubenswrapper[4693]: I1204 10:05:43.923673 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e"} err="failed to get container status \"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e\": rpc error: code = NotFound desc = could not find container \"fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e\": container with ID starting with fa5f0f6a44aa5e360c05625ae7a266e108b3d5eb98aea7c3558e1f1e1c99d89e not found: ID does not exist" Dec 04 10:05:44 crc kubenswrapper[4693]: I1204 10:05:44.472477 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9928acda-0163-4fde-8635-f861c13c43fb" path="/var/lib/kubelet/pods/9928acda-0163-4fde-8635-f861c13c43fb/volumes" Dec 04 10:05:48 crc kubenswrapper[4693]: E1204 10:05:48.152597 4693 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.692s" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.152987 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.153009 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.153020 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.153110 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.153122 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:05:48 crc kubenswrapper[4693]: I1204 10:05:48.201435 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:05:49 crc kubenswrapper[4693]: I1204 10:05:49.478478 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:05:49 crc kubenswrapper[4693]: I1204 10:05:49.534981 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:05:49 crc kubenswrapper[4693]: I1204 10:05:49.535212 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" containerID="cri-o://e5dedb9ba7c84c0ecc079aba1c170dec64e90bed927ebba1c4091fe857eafe51" gracePeriod=10 Dec 04 10:05:51 crc kubenswrapper[4693]: I1204 10:05:51.898643 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" 
podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Dec 04 10:05:51 crc kubenswrapper[4693]: I1204 10:05:51.939126 4693 generic.go:334] "Generic (PLEG): container finished" podID="d7a457be-015b-44a1-b55c-d0254008b53f" containerID="e5dedb9ba7c84c0ecc079aba1c170dec64e90bed927ebba1c4091fe857eafe51" exitCode=0 Dec 04 10:05:51 crc kubenswrapper[4693]: I1204 10:05:51.939182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" event={"ID":"d7a457be-015b-44a1-b55c-d0254008b53f","Type":"ContainerDied","Data":"e5dedb9ba7c84c0ecc079aba1c170dec64e90bed927ebba1c4091fe857eafe51"} Dec 04 10:05:52 crc kubenswrapper[4693]: I1204 10:05:52.112757 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:05:52 crc kubenswrapper[4693]: I1204 10:05:52.200619 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:05:56 crc kubenswrapper[4693]: I1204 10:05:56.898415 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.575256 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:06:01 crc kubenswrapper[4693]: E1204 10:06:01.597625 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 04 10:06:01 crc kubenswrapper[4693]: E1204 10:06:01.597967 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pz5xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(edf292dd-e2f4-4e80-ab9a-3e548118489a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 10:06:01 crc kubenswrapper[4693]: E1204 10:06:01.604608 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.629661 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-596dc75986-wjgrk" Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.875285 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.990721 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.990800 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6hr\" (UniqueName: \"kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.990863 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.990921 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.991114 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:01 crc kubenswrapper[4693]: I1204 10:06:01.991748 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0\") pod \"d7a457be-015b-44a1-b55c-d0254008b53f\" (UID: \"d7a457be-015b-44a1-b55c-d0254008b53f\") " Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.002954 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr" (OuterVolumeSpecName: "kube-api-access-7m6hr") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). InnerVolumeSpecName "kube-api-access-7m6hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.045353 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="ceilometer-notification-agent" containerID="cri-o://abbc17a2851c37f8bc1cc8459cd7a0c1f319183cde18e70e698a76638d199c1c" gracePeriod=30 Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.045805 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="sg-core" containerID="cri-o://03b0ae154874e27f4fd270f8d73e693e94e8793a55b2936871d9df8e22d541fb" gracePeriod=30 Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.046166 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" event={"ID":"d7a457be-015b-44a1-b55c-d0254008b53f","Type":"ContainerDied","Data":"6b46f750a89e37df6b68986ecfe97c5373d50fbc0a79568c6b6a61cc1d3ceaf8"} Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.046179 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.046212 4693 scope.go:117] "RemoveContainer" containerID="e5dedb9ba7c84c0ecc079aba1c170dec64e90bed927ebba1c4091fe857eafe51" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.049380 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mlf2h" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.053094 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.071765 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.077657 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config" (OuterVolumeSpecName: "config") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.092778 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7a457be-015b-44a1-b55c-d0254008b53f" (UID: "d7a457be-015b-44a1-b55c-d0254008b53f"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094744 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094781 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094800 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094818 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6hr\" (UniqueName: \"kubernetes.io/projected/d7a457be-015b-44a1-b55c-d0254008b53f-kube-api-access-7m6hr\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094835 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.094850 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a457be-015b-44a1-b55c-d0254008b53f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.119860 4693 scope.go:117] "RemoveContainer" containerID="af2fefd302a70b31602db0fcbd43f23149a6b5efa86648d1f2fbfc800903d463" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.382964 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.390212 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mlf2h"] Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.471862 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" path="/var/lib/kubelet/pods/d7a457be-015b-44a1-b55c-d0254008b53f/volumes" Dec 04 10:06:02 crc kubenswrapper[4693]: I1204 10:06:02.704588 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f774bdc67-hjxts" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.056697 4693 generic.go:334] "Generic (PLEG): container finished" podID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerID="03b0ae154874e27f4fd270f8d73e693e94e8793a55b2936871d9df8e22d541fb" exitCode=2 Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.056884 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerDied","Data":"03b0ae154874e27f4fd270f8d73e693e94e8793a55b2936871d9df8e22d541fb"} Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.422200 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 10:06:03 crc kubenswrapper[4693]: E1204 10:06:03.423256 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" Dec 04 10:06:03 
crc kubenswrapper[4693]: I1204 10:06:03.423402 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" Dec 04 10:06:03 crc kubenswrapper[4693]: E1204 10:06:03.423502 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-api" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.423576 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-api" Dec 04 10:06:03 crc kubenswrapper[4693]: E1204 10:06:03.423647 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.423713 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" Dec 04 10:06:03 crc kubenswrapper[4693]: E1204 10:06:03.423785 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="init" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.423850 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="init" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.424164 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-api" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.424252 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a457be-015b-44a1-b55c-d0254008b53f" containerName="dnsmasq-dns" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.424520 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9928acda-0163-4fde-8635-f861c13c43fb" containerName="neutron-httpd" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.425450 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.427632 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.428015 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.428976 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pvjqr" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.433867 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.517618 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.517670 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.517790 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.517834 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fr5t\" (UniqueName: \"kubernetes.io/projected/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-kube-api-access-6fr5t\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.619284 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fr5t\" (UniqueName: \"kubernetes.io/projected/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-kube-api-access-6fr5t\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.619442 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.619480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.619566 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.620616 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.625755 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.626035 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.636534 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fr5t\" (UniqueName: \"kubernetes.io/projected/b2fefcbd-5f7f-4544-8f03-49adbe23a11b-kube-api-access-6fr5t\") pod \"openstackclient\" (UID: \"b2fefcbd-5f7f-4544-8f03-49adbe23a11b\") " pod="openstack/openstackclient" Dec 04 10:06:03 crc kubenswrapper[4693]: I1204 10:06:03.753635 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 10:06:04 crc kubenswrapper[4693]: I1204 10:06:04.083235 4693 generic.go:334] "Generic (PLEG): container finished" podID="a62cb864-103c-4b89-afeb-8397af4046cb" containerID="714f5dd6229837b0c880729a145472584d9396fcefc23b643b9687974c4c9ffe" exitCode=0 Dec 04 10:06:04 crc kubenswrapper[4693]: I1204 10:06:04.083382 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p69cr" event={"ID":"a62cb864-103c-4b89-afeb-8397af4046cb","Type":"ContainerDied","Data":"714f5dd6229837b0c880729a145472584d9396fcefc23b643b9687974c4c9ffe"} Dec 04 10:06:04 crc kubenswrapper[4693]: I1204 10:06:04.225376 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.101703 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b2fefcbd-5f7f-4544-8f03-49adbe23a11b","Type":"ContainerStarted","Data":"b6f1abf5935f17d33b85dfcaca139f1fb57d3c78e6409563bbe491ff6c5aa2f1"} Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.528500 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-p69cr" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.656194 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle\") pod \"a62cb864-103c-4b89-afeb-8397af4046cb\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.656779 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmz2\" (UniqueName: \"kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2\") pod \"a62cb864-103c-4b89-afeb-8397af4046cb\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.656883 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data\") pod \"a62cb864-103c-4b89-afeb-8397af4046cb\" (UID: \"a62cb864-103c-4b89-afeb-8397af4046cb\") " Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.661570 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2" (OuterVolumeSpecName: "kube-api-access-6tmz2") pod "a62cb864-103c-4b89-afeb-8397af4046cb" (UID: "a62cb864-103c-4b89-afeb-8397af4046cb"). InnerVolumeSpecName "kube-api-access-6tmz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.661827 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a62cb864-103c-4b89-afeb-8397af4046cb" (UID: "a62cb864-103c-4b89-afeb-8397af4046cb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.687616 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a62cb864-103c-4b89-afeb-8397af4046cb" (UID: "a62cb864-103c-4b89-afeb-8397af4046cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.758698 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.758902 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmz2\" (UniqueName: \"kubernetes.io/projected/a62cb864-103c-4b89-afeb-8397af4046cb-kube-api-access-6tmz2\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:05 crc kubenswrapper[4693]: I1204 10:06:05.758992 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62cb864-103c-4b89-afeb-8397af4046cb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.117568 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p69cr" event={"ID":"a62cb864-103c-4b89-afeb-8397af4046cb","Type":"ContainerDied","Data":"ec4415adebedb9e0cf252ebac162686d060affc48b4259f371212f89081a7465"} Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.117604 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec4415adebedb9e0cf252ebac162686d060affc48b4259f371212f89081a7465" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.117656 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-p69cr" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.381144 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76bdf94f96-jnvk8"] Dec 04 10:06:06 crc kubenswrapper[4693]: E1204 10:06:06.391515 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" containerName="barbican-db-sync" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.391569 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" containerName="barbican-db-sync" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.393619 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" containerName="barbican-db-sync" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.399181 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.404171 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.404288 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qc2xk" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.416214 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.416781 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76bdf94f96-jnvk8"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.454583 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d7c5fcd89-hgn5p"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.458002 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.497896 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.509544 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d7c5fcd89-hgn5p"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.521914 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-logs\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.522037 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data-custom\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.522358 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.522382 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtt9n\" (UniqueName: \"kubernetes.io/projected/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-kube-api-access-jtt9n\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.522409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-combined-ca-bundle\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.551988 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.553999 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.567889 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627050 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627113 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627172 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-logs\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627198 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-combined-ca-bundle\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627227 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e00b6b-bd3b-4198-8333-1515f919cbfc-logs\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data-custom\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627300 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627322 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc 
kubenswrapper[4693]: I1204 10:06:06.627373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627394 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9f76\" (UniqueName: \"kubernetes.io/projected/33e00b6b-bd3b-4198-8333-1515f919cbfc-kube-api-access-v9f76\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627418 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzf5r\" (UniqueName: \"kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627467 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtt9n\" (UniqueName: \"kubernetes.io/projected/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-kube-api-access-jtt9n\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627504 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-combined-ca-bundle\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627531 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627555 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data-custom\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.627990 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-logs\") pod 
\"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.639754 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.650009 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-combined-ca-bundle\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.651149 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-config-data-custom\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.655573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtt9n\" (UniqueName: \"kubernetes.io/projected/ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340-kube-api-access-jtt9n\") pod \"barbican-worker-d7c5fcd89-hgn5p\" (UID: \"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340\") " pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.672820 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.675891 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.679161 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.703452 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730011 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzf5r\" (UniqueName: \"kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730090 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730120 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data-custom\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730154 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730185 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730222 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730296 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-combined-ca-bundle\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " 
pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730320 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730364 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e00b6b-bd3b-4198-8333-1515f919cbfc-logs\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730386 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsbv\" (UniqueName: \"kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730445 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730464 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730492 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.730510 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9f76\" (UniqueName: \"kubernetes.io/projected/33e00b6b-bd3b-4198-8333-1515f919cbfc-kube-api-access-v9f76\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.731857 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.731871 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.732248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33e00b6b-bd3b-4198-8333-1515f919cbfc-logs\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.732645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.733585 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.734419 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.738527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.741356 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-config-data-custom\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.742142 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e00b6b-bd3b-4198-8333-1515f919cbfc-combined-ca-bundle\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.762441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzf5r\" (UniqueName: 
\"kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r\") pod \"dnsmasq-dns-848cf88cfc-kwjv6\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.778313 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9f76\" (UniqueName: \"kubernetes.io/projected/33e00b6b-bd3b-4198-8333-1515f919cbfc-kube-api-access-v9f76\") pod \"barbican-keystone-listener-76bdf94f96-jnvk8\" (UID: \"33e00b6b-bd3b-4198-8333-1515f919cbfc\") " pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.830881 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.832573 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.832649 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.832692 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.832729 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsbv\" (UniqueName: \"kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.832776 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.833278 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.838387 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc 
kubenswrapper[4693]: I1204 10:06:06.841965 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.842726 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.856284 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsbv\" (UniqueName: \"kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv\") pod \"barbican-api-d6f6d8ffd-tv96v\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.865966 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:06 crc kubenswrapper[4693]: I1204 10:06:06.884553 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:07 crc kubenswrapper[4693]: I1204 10:06:07.060679 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" Dec 04 10:06:07 crc kubenswrapper[4693]: I1204 10:06:07.325886 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d7c5fcd89-hgn5p"] Dec 04 10:06:07 crc kubenswrapper[4693]: I1204 10:06:07.427673 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:07 crc kubenswrapper[4693]: I1204 10:06:07.597177 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76bdf94f96-jnvk8"] Dec 04 10:06:07 crc kubenswrapper[4693]: W1204 10:06:07.600202 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e00b6b_bd3b_4198_8333_1515f919cbfc.slice/crio-1398271282a0904ef51abfdde72e9e4bae4d4bff374f0983528f19fd37295a32 WatchSource:0}: Error finding container 1398271282a0904ef51abfdde72e9e4bae4d4bff374f0983528f19fd37295a32: Status 404 returned error can't find the container with id 1398271282a0904ef51abfdde72e9e4bae4d4bff374f0983528f19fd37295a32 Dec 04 10:06:08 crc kubenswrapper[4693]: I1204 10:06:08.148863 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerStarted","Data":"e1e568e454553cec9a27f463fb6404bcc7b8cb42c46760148bbfe3bd0c5a3728"} Dec 04 10:06:08 crc kubenswrapper[4693]: I1204 10:06:08.151603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" event={"ID":"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340","Type":"ContainerStarted","Data":"046ab717875f996d3f94879b53129c60c01022e8259beda885e728580e7eda6c"} Dec 04 10:06:08 crc kubenswrapper[4693]: I1204 10:06:08.152157 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" event={"ID":"33e00b6b-bd3b-4198-8333-1515f919cbfc","Type":"ContainerStarted","Data":"1398271282a0904ef51abfdde72e9e4bae4d4bff374f0983528f19fd37295a32"} Dec 04 10:06:09 crc kubenswrapper[4693]: W1204 10:06:09.008455 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc964bd03_cc2a_461a_a3ad_3e8118ed8a82.slice/crio-eb1382ad5b6b5ea6fae11f4a78942f6b2a23c17e49452098efdf9ce049ac9b84 WatchSource:0}: Error finding container eb1382ad5b6b5ea6fae11f4a78942f6b2a23c17e49452098efdf9ce049ac9b84: Status 404 returned error can't find the container with id eb1382ad5b6b5ea6fae11f4a78942f6b2a23c17e49452098efdf9ce049ac9b84 Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.011340 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.185030 4693 generic.go:334] "Generic (PLEG): container finished" podID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerID="abbc17a2851c37f8bc1cc8459cd7a0c1f319183cde18e70e698a76638d199c1c" exitCode=0 Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.185165 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerDied","Data":"abbc17a2851c37f8bc1cc8459cd7a0c1f319183cde18e70e698a76638d199c1c"} Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.191201 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerStarted","Data":"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae"} Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.197651 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" event={"ID":"c964bd03-cc2a-461a-a3ad-3e8118ed8a82","Type":"ContainerStarted","Data":"eb1382ad5b6b5ea6fae11f4a78942f6b2a23c17e49452098efdf9ce049ac9b84"} Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.218126 4693 generic.go:334] "Generic (PLEG): container finished" podID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerID="f1cd933525400a784d0d5d17a18b7c228b22829e893d0fb79dbba4b73a786655" exitCode=137 Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.218175 4693 generic.go:334] "Generic (PLEG): container finished" podID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerID="e98854c320adffc66ab9b3f27212c1efb5e2d439a4194f947da8e97aa18a8e59" exitCode=137 Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.218202 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerDied","Data":"f1cd933525400a784d0d5d17a18b7c228b22829e893d0fb79dbba4b73a786655"} Dec 04 10:06:09 crc kubenswrapper[4693]: I1204 10:06:09.218242 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerDied","Data":"e98854c320adffc66ab9b3f27212c1efb5e2d439a4194f947da8e97aa18a8e59"} Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.044243 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dcd5dc8-znmkm"] Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.050613 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.059620 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.060595 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dcd5dc8-znmkm"] Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.062217 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.120901 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.165840 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.166191 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99z4b\" (UniqueName: \"kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.166251 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.166375 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.166489 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.167515 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.167562 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key\") pod \"5d096d1f-bf52-413c-9cb9-4c89179e5725\" (UID: \"5d096d1f-bf52-413c-9cb9-4c89179e5725\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.167877 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5chw\" (UniqueName: \"kubernetes.io/projected/c8aed54b-9500-4b6d-a966-64fb3cff7b45-kube-api-access-j5chw\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: 
\"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.167963 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-internal-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.167986 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8aed54b-9500-4b6d-a966-64fb3cff7b45-logs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.168084 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data-custom\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.168121 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-combined-ca-bundle\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.168144 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-public-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.168168 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.169557 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs" (OuterVolumeSpecName: "logs") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.177440 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.177971 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b" (OuterVolumeSpecName: "kube-api-access-99z4b") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "kube-api-access-99z4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.226041 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts" (OuterVolumeSpecName: "scripts") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.257516 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data" (OuterVolumeSpecName: "config-data") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.263913 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7649787bc6-fddzc" event={"ID":"5d096d1f-bf52-413c-9cb9-4c89179e5725","Type":"ContainerDied","Data":"60f84246054c776836d04a6e91722840cd292470a9ca6050d3f9e2f19353f533"} Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.263988 4693 scope.go:117] "RemoveContainer" containerID="f1cd933525400a784d0d5d17a18b7c228b22829e893d0fb79dbba4b73a786655" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.264180 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7649787bc6-fddzc" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.265001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273096 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5chw\" (UniqueName: \"kubernetes.io/projected/c8aed54b-9500-4b6d-a966-64fb3cff7b45-kube-api-access-j5chw\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273182 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-internal-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273206 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8aed54b-9500-4b6d-a966-64fb3cff7b45-logs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data-custom\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273288 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-combined-ca-bundle\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273311 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-public-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273529 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273595 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273607 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273620 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d096d1f-bf52-413c-9cb9-4c89179e5725-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273629 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99z4b\" (UniqueName: \"kubernetes.io/projected/5d096d1f-bf52-413c-9cb9-4c89179e5725-kube-api-access-99z4b\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273638 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.273646 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d096d1f-bf52-413c-9cb9-4c89179e5725-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.274867 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8aed54b-9500-4b6d-a966-64fb3cff7b45-logs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.286347 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-public-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.290932 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-internal-tls-certs\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.292135 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data-custom\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.308512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5d096d1f-bf52-413c-9cb9-4c89179e5725" (UID: "5d096d1f-bf52-413c-9cb9-4c89179e5725"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.313692 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-config-data\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.313708 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5chw\" (UniqueName: \"kubernetes.io/projected/c8aed54b-9500-4b6d-a966-64fb3cff7b45-kube-api-access-j5chw\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.314423 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8aed54b-9500-4b6d-a966-64fb3cff7b45-combined-ca-bundle\") pod \"barbican-api-5dcd5dc8-znmkm\" (UID: \"c8aed54b-9500-4b6d-a966-64fb3cff7b45\") " pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.321594 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.322134 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edf292dd-e2f4-4e80-ab9a-3e548118489a","Type":"ContainerDied","Data":"94dd8bf202506c2751dcbc9c3a21f70d2ea17bc2df286815d3fa2720fee25c2c"} Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.333683 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerStarted","Data":"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9"} Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.335424 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.335462 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.349378 4693 generic.go:334] "Generic (PLEG): container finished" podID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerID="e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834" exitCode=0 Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.349443 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" event={"ID":"c964bd03-cc2a-461a-a3ad-3e8118ed8a82","Type":"ContainerDied","Data":"e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834"} Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375068 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz5xm\" (UniqueName: \"kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle\") pod 
\"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375374 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375446 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375497 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375559 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.375694 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml\") pod \"edf292dd-e2f4-4e80-ab9a-3e548118489a\" (UID: \"edf292dd-e2f4-4e80-ab9a-3e548118489a\") " Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.376494 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d096d1f-bf52-413c-9cb9-4c89179e5725-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.388648 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm" (OuterVolumeSpecName: "kube-api-access-pz5xm") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "kube-api-access-pz5xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.392637 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.394163 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.396951 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d6f6d8ffd-tv96v" podStartSLOduration=4.396928383 podStartE2EDuration="4.396928383s" podCreationTimestamp="2025-12-04 10:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:10.381198999 +0000 UTC m=+1416.278792752" watchObservedRunningTime="2025-12-04 10:06:10.396928383 +0000 UTC m=+1416.294522126" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.399524 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts" (OuterVolumeSpecName: "scripts") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.414816 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.433262 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.442121 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data" (OuterVolumeSpecName: "config-data") pod "edf292dd-e2f4-4e80-ab9a-3e548118489a" (UID: "edf292dd-e2f4-4e80-ab9a-3e548118489a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.451982 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479376 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479405 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479657 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479678 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479692 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edf292dd-e2f4-4e80-ab9a-3e548118489a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479704 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edf292dd-e2f4-4e80-ab9a-3e548118489a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.479716 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz5xm\" (UniqueName: \"kubernetes.io/projected/edf292dd-e2f4-4e80-ab9a-3e548118489a-kube-api-access-pz5xm\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.549535 4693 scope.go:117] "RemoveContainer" containerID="e98854c320adffc66ab9b3f27212c1efb5e2d439a4194f947da8e97aa18a8e59" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.587717 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.593732 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7649787bc6-fddzc"] Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.639937 4693 scope.go:117] "RemoveContainer" containerID="03b0ae154874e27f4fd270f8d73e693e94e8793a55b2936871d9df8e22d541fb" Dec 04 10:06:10 crc kubenswrapper[4693]: I1204 10:06:10.674820 4693 scope.go:117] "RemoveContainer" containerID="abbc17a2851c37f8bc1cc8459cd7a0c1f319183cde18e70e698a76638d199c1c" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.120036 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dcd5dc8-znmkm"] Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.358357 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.359930 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dcd5dc8-znmkm" event={"ID":"c8aed54b-9500-4b6d-a966-64fb3cff7b45","Type":"ContainerStarted","Data":"b4a5a65ce337498c9e0e2af252021725c35c549b951716313fbbb1f34a847f38"} Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.461592 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.497386 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.526078 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:11 crc kubenswrapper[4693]: E1204 10:06:11.526657 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="sg-core" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.526678 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="sg-core" Dec 04 10:06:11 crc kubenswrapper[4693]: E1204 10:06:11.526692 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon-log" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.526699 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon-log" Dec 04 10:06:11 crc kubenswrapper[4693]: E1204 10:06:11.526709 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.526716 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" Dec 04 10:06:11 crc kubenswrapper[4693]: E1204 10:06:11.526740 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="ceilometer-notification-agent" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.526746 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="ceilometer-notification-agent" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.533092 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="sg-core" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.533140 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" containerName="ceilometer-notification-agent" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.533165 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.533192 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" containerName="horizon-log" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.563276 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.571117 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.571673 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.610915 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqp6\" (UniqueName: \"kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.610986 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.611017 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.611045 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.611067 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.611142 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.611176 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.639547 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.713752 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqp6\" (UniqueName: \"kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: 
I1204 10:06:11.713819 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.713850 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.713880 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.713899 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.714188 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.714273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.715011 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.715780 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.723132 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.724282 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.732987 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.736762 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqp6\" (UniqueName: \"kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.746624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " pod="openstack/ceilometer-0" Dec 04 10:06:11 crc kubenswrapper[4693]: I1204 10:06:11.970349 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.373105 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dcd5dc8-znmkm" event={"ID":"c8aed54b-9500-4b6d-a966-64fb3cff7b45","Type":"ContainerStarted","Data":"71137da06363d3a383e439192bfdc6396653bd2cce0bb9fdb44d791cd9668806"} Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.375243 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" event={"ID":"c964bd03-cc2a-461a-a3ad-3e8118ed8a82","Type":"ContainerStarted","Data":"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b"} Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.407559 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" podStartSLOduration=6.4075342840000005 podStartE2EDuration="6.407534284s" podCreationTimestamp="2025-12-04 10:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:12.398996499 +0000 UTC m=+1418.296590252" watchObservedRunningTime="2025-12-04 10:06:12.407534284 +0000 UTC m=+1418.305128037" Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.479865 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d096d1f-bf52-413c-9cb9-4c89179e5725" path="/var/lib/kubelet/pods/5d096d1f-bf52-413c-9cb9-4c89179e5725/volumes" Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.480617 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf292dd-e2f4-4e80-ab9a-3e548118489a" path="/var/lib/kubelet/pods/edf292dd-e2f4-4e80-ab9a-3e548118489a/volumes" Dec 04 10:06:12 crc kubenswrapper[4693]: I1204 10:06:12.481475 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:12 crc kubenswrapper[4693]: W1204 10:06:12.842129 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099d5d4d_d73b_442f_8210_d65c8e2a8317.slice/crio-8b9825a003f2852c43372176787479ff059af33d693c842dccdfb1b481209b76 WatchSource:0}: Error finding container 8b9825a003f2852c43372176787479ff059af33d693c842dccdfb1b481209b76: Status 404 returned error can't find the container with id 8b9825a003f2852c43372176787479ff059af33d693c842dccdfb1b481209b76 Dec 04 10:06:13 crc kubenswrapper[4693]: I1204 10:06:13.387405 
4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerStarted","Data":"8b9825a003f2852c43372176787479ff059af33d693c842dccdfb1b481209b76"} Dec 04 10:06:13 crc kubenswrapper[4693]: I1204 10:06:13.387714 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.249547 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-57857fb86f-8m84s"] Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.251529 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.254552 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.255128 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.257789 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261501 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-log-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261597 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-run-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261656 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-etc-swift\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-config-data\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdsx2\" (UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-kube-api-access-mdsx2\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-combined-ca-bundle\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261814 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-public-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.261834 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-internal-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.273543 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57857fb86f-8m84s"] Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-internal-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-log-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364445 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-run-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364512 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-etc-swift\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364905 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-config-data\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.365034 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdsx2\" (UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-kube-api-access-mdsx2\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: 
I1204 10:06:20.365057 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-combined-ca-bundle\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.365086 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-public-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364930 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-log-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.364970 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-run-httpd\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.383568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-combined-ca-bundle\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.383570 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-internal-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.383761 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-config-data\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.384099 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-public-tls-certs\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.387098 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-etc-swift\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.392105 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdsx2\" 
(UniqueName: \"kubernetes.io/projected/3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a-kube-api-access-mdsx2\") pod \"swift-proxy-57857fb86f-8m84s\" (UID: \"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a\") " pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.557581 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.580999 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.633518 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:20 crc kubenswrapper[4693]: I1204 10:06:20.732514 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:21 crc kubenswrapper[4693]: I1204 10:06:21.892002 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:21 crc kubenswrapper[4693]: I1204 10:06:21.969889 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:06:21 crc kubenswrapper[4693]: I1204 10:06:21.970130 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="dnsmasq-dns" containerID="cri-o://c911e4ffb75a2ebfcbdfc785b7a48cc5605208fd6d0a663239513b8b5cb58d89" gracePeriod=10 Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.230010 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57857fb86f-8m84s"] Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.272413 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.272455 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.580655 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dcd5dc8-znmkm" event={"ID":"c8aed54b-9500-4b6d-a966-64fb3cff7b45","Type":"ContainerStarted","Data":"3e5b07f4ee204f9a46d7911b0f4e7c3e0d446ed364cc837f56b585d2043761f9"} Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.581743 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.582031 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.583613 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dcd5dc8-znmkm" podUID="c8aed54b-9500-4b6d-a966-64fb3cff7b45" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": dial tcp 
10.217.0.164:9311: connect: connection refused" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.602556 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" event={"ID":"33e00b6b-bd3b-4198-8333-1515f919cbfc","Type":"ContainerStarted","Data":"9d084557014d7e4d44c24497a0b31f20289f42e3e0f8a9522035182f34cc09b9"} Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.606394 4693 generic.go:334] "Generic (PLEG): container finished" podID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerID="c911e4ffb75a2ebfcbdfc785b7a48cc5605208fd6d0a663239513b8b5cb58d89" exitCode=0 Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.606488 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerDied","Data":"c911e4ffb75a2ebfcbdfc785b7a48cc5605208fd6d0a663239513b8b5cb58d89"} Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.622218 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dcd5dc8-znmkm" podStartSLOduration=12.622199791 podStartE2EDuration="12.622199791s" podCreationTimestamp="2025-12-04 10:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:22.620614807 +0000 UTC m=+1428.518208590" watchObservedRunningTime="2025-12-04 10:06:22.622199791 +0000 UTC m=+1428.519793544" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.633431 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" event={"ID":"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340","Type":"ContainerStarted","Data":"e1c9dd5fe474eca74d04a2a473b15e6abc451ed9f894abc1638d9c56f57dc15b"} Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.639930 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57857fb86f-8m84s" event={"ID":"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a","Type":"ContainerStarted","Data":"757fb0873a86546ead1dbdd05994a25f5cf8176da30cccfaa12e3ffafacd74e8"} Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.851086 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950319 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950402 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54c66\" (UniqueName: \"kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950427 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950500 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950543 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:22 crc kubenswrapper[4693]: I1204 10:06:22.950568 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.000761 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66" (OuterVolumeSpecName: "kube-api-access-54c66") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "kube-api-access-54c66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.053318 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54c66\" (UniqueName: \"kubernetes.io/projected/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-kube-api-access-54c66\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.121391 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.160368 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.162610 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.167417 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") pod \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\" (UID: \"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c\") " Dec 04 10:06:23 crc kubenswrapper[4693]: W1204 10:06:23.168143 4693 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c/volumes/kubernetes.io~configmap/ovsdbserver-nb Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.168239 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.168568 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.168594 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.168609 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.172705 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config" (OuterVolumeSpecName: "config") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.183553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" (UID: "9c1e1a17-c252-42be-af6b-d6a3d0cacf8c"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.270760 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.270927 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.653671 4693 generic.go:334] "Generic (PLEG): container finished" podID="832e603c-b695-442e-bcf6-fa322cfc1524" containerID="e1856c4bcc5f7d4dd767ab5d0ee7c26159e7fce3876db52d046b3e1b06a476cf" exitCode=0 Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.653769 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q642m" event={"ID":"832e603c-b695-442e-bcf6-fa322cfc1524","Type":"ContainerDied","Data":"e1856c4bcc5f7d4dd767ab5d0ee7c26159e7fce3876db52d046b3e1b06a476cf"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.656348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" event={"ID":"9c1e1a17-c252-42be-af6b-d6a3d0cacf8c","Type":"ContainerDied","Data":"849236c820a9a1307492e7599e96d082b828cc85c5b38a7ed042754a0b8a5fc5"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.656379 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lsdbk" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.656394 4693 scope.go:117] "RemoveContainer" containerID="c911e4ffb75a2ebfcbdfc785b7a48cc5605208fd6d0a663239513b8b5cb58d89" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.665538 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b2fefcbd-5f7f-4544-8f03-49adbe23a11b","Type":"ContainerStarted","Data":"825c6e88d710fc0db9336015a7b663405a17cf201dd5980209d21f3efac4f107"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.669836 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" event={"ID":"ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340","Type":"ContainerStarted","Data":"5e01d2906778e30537de22b22083d39cd28d0399c8c1a0ba00ebccb0bbc3cb13"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.679874 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57857fb86f-8m84s" event={"ID":"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a","Type":"ContainerStarted","Data":"a376afc26464fc668a5ce5554dd53eeede2af3a550ba970c3e7a99ded365a29f"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.684299 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" event={"ID":"33e00b6b-bd3b-4198-8333-1515f919cbfc","Type":"ContainerStarted","Data":"a134c253b2cf75c3a9e32e567523d48c3e8510d8852960f442661947ea0ec07e"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.686034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerStarted","Data":"cb1c2b2a7ebfb705d9d4092b900bd3f05395c980257a42dec3bf73c1bdd47704"} Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.722534 4693 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/openstackclient" podStartSLOduration=3.434302458 podStartE2EDuration="20.722514735s" podCreationTimestamp="2025-12-04 10:06:03 +0000 UTC" firstStartedPulling="2025-12-04 10:06:04.234427921 +0000 UTC m=+1410.132021724" lastFinishedPulling="2025-12-04 10:06:21.522640248 +0000 UTC m=+1427.420234001" observedRunningTime="2025-12-04 10:06:23.699741197 +0000 UTC m=+1429.597334950" watchObservedRunningTime="2025-12-04 10:06:23.722514735 +0000 UTC m=+1429.620108488" Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.725902 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.733632 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lsdbk"] Dec 04 10:06:23 crc kubenswrapper[4693]: I1204 10:06:23.815544 4693 scope.go:117] "RemoveContainer" containerID="bcf30f7573081bc537a4fb717a7daba2a42b91c9352ca3c8498d7b1be9ff6f2b" Dec 04 10:06:24 crc kubenswrapper[4693]: I1204 10:06:24.475460 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" path="/var/lib/kubelet/pods/9c1e1a17-c252-42be-af6b-d6a3d0cacf8c/volumes" Dec 04 10:06:24 crc kubenswrapper[4693]: I1204 10:06:24.726098 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76bdf94f96-jnvk8" podStartSLOduration=4.805200656 podStartE2EDuration="18.726081816s" podCreationTimestamp="2025-12-04 10:06:06 +0000 UTC" firstStartedPulling="2025-12-04 10:06:07.603214128 +0000 UTC m=+1413.500807881" lastFinishedPulling="2025-12-04 10:06:21.524095288 +0000 UTC m=+1427.421689041" observedRunningTime="2025-12-04 10:06:24.723940356 +0000 UTC m=+1430.621534129" watchObservedRunningTime="2025-12-04 10:06:24.726081816 +0000 UTC m=+1430.623675569" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.151866 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-q642m" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238032 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238139 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238290 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238411 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238437 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238488 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbh6c\" (UniqueName: \"kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c\") pod \"832e603c-b695-442e-bcf6-fa322cfc1524\" (UID: \"832e603c-b695-442e-bcf6-fa322cfc1524\") " Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.238680 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.239109 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/832e603c-b695-442e-bcf6-fa322cfc1524-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.258457 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.265526 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c" (OuterVolumeSpecName: "kube-api-access-nbh6c") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "kube-api-access-nbh6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.297553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts" (OuterVolumeSpecName: "scripts") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.334510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data" (OuterVolumeSpecName: "config-data") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.341515 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "832e603c-b695-442e-bcf6-fa322cfc1524" (UID: "832e603c-b695-442e-bcf6-fa322cfc1524"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.342458 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.342524 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.342541 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbh6c\" (UniqueName: \"kubernetes.io/projected/832e603c-b695-442e-bcf6-fa322cfc1524-kube-api-access-nbh6c\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.342560 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.342574 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e603c-b695-442e-bcf6-fa322cfc1524-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.712200 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57857fb86f-8m84s" event={"ID":"3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a","Type":"ContainerStarted","Data":"5dafb348c977134cfeff22f5ae1a462e78678da993925a63589b00bf30cce692"} Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.715084 4693 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q642m" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.715135 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q642m" event={"ID":"832e603c-b695-442e-bcf6-fa322cfc1524","Type":"ContainerDied","Data":"a7cd6b439a4228953c57b6b93122ffcf4dc2b462426e0856cccdc0f580ab0b13"} Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.715198 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cd6b439a4228953c57b6b93122ffcf4dc2b462426e0856cccdc0f580ab0b13" Dec 04 10:06:25 crc kubenswrapper[4693]: I1204 10:06:25.737100 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d7c5fcd89-hgn5p" podStartSLOduration=5.553843157 podStartE2EDuration="19.737072643s" podCreationTimestamp="2025-12-04 10:06:06 +0000 UTC" firstStartedPulling="2025-12-04 10:06:07.340293437 +0000 UTC m=+1413.237887190" lastFinishedPulling="2025-12-04 10:06:21.523522923 +0000 UTC m=+1427.421116676" observedRunningTime="2025-12-04 10:06:25.733031682 +0000 UTC m=+1431.630625445" watchObservedRunningTime="2025-12-04 10:06:25.737072643 +0000 UTC m=+1431.634666396" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.553452 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:26 crc kubenswrapper[4693]: E1204 10:06:26.554860 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="dnsmasq-dns" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.554886 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="dnsmasq-dns" Dec 04 10:06:26 crc kubenswrapper[4693]: E1204 10:06:26.554920 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" containerName="cinder-db-sync" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.554934 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" containerName="cinder-db-sync" Dec 04 10:06:26 crc kubenswrapper[4693]: E1204 10:06:26.554955 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="init" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.554962 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="init" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.555161 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" containerName="cinder-db-sync" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.555200 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1e1a17-c252-42be-af6b-d6a3d0cacf8c" containerName="dnsmasq-dns" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.556319 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.575513 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.679763 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.679829 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.679899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.679933 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9kl\" (UniqueName: \"kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.679984 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.680080 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.686399 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.688752 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.691719 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.692004 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fg25t" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.692136 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.693940 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.694185 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.696253 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.706613 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.708462 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.716283 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.717421 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.729133 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784435 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784498 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784557 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784586 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784606 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784625 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784651 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784669 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784694 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784941 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.784994 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786116 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786273 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctnvk\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786317 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r4c7b\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786441 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786501 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786545 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786564 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786611 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786679 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786738 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786762 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786785 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786804 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786847 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786888 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786921 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.786996 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787023 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787049 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787081 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787111 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787149 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787165 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.787868 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.788635 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789476 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " 
pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789553 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9kl\" (UniqueName: \"kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789623 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789643 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789685 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gslmg\" (UniqueName: \"kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.789732 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.797741 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.802469 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.819047 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerStarted","Data":"a213ec1caa55e687834ca38482091cc97cc8a8a46e7b7c50738ec3d3084968c6"} Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.821144 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.835490 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.829478 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn9kl\" (UniqueName: \"kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl\") pod \"dnsmasq-dns-6578955fd5-q7wmt\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.881387 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892849 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892896 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892939 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892957 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892977 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.892999 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893032 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893049 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893072 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893086 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893103 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gslmg\" (UniqueName: \"kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893122 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893148 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893171 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893186 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893201 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893215 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893233 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893248 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893282 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893297 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893319 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctnvk\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893358 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4c7b\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893386 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893406 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893451 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893466 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893485 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893501 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893523 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893544 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893573 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893588 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893610 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893719 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893880 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.893941 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.903815 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.904442 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.904484 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.904573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.904610 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.909637 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.909798 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.909875 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910085 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910229 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910291 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910315 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910435 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.910472 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.911697 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc 
kubenswrapper[4693]: I1204 10:06:26.913348 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.913414 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.914039 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.914689 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.914876 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.914986 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.915071 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.915426 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.915527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.915740 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.916070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.916815 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.923151 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.930299 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gslmg\" (UniqueName: \"kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.932016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.932112 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.932205 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.944185 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctnvk\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk\") pod \"cinder-volume-volume1-0\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.944597 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4c7b\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b\") pod \"cinder-backup-0\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " pod="openstack/cinder-backup-0" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.951884 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:26 crc kubenswrapper[4693]: I1204 10:06:26.970002 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.012763 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.037434 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.060407 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.158953 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.162921 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.165667 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.173267 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.210688 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.210793 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.210870 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.210988 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.211031 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8cqz\" (UniqueName: \"kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.211087 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " 
pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.211132 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313116 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313204 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313293 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8cqz\" (UniqueName: \"kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313358 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313406 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.313501 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.314224 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " 
pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.339845 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.339898 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.341106 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.348262 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.396865 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8cqz\" (UniqueName: \"kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz\") pod \"cinder-api-0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.537177 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.577846 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.859980 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" event={"ID":"dc52ec7b-620c-419c-9f73-792c1b0c638f","Type":"ContainerStarted","Data":"f10347d472f0d077082116c20242809b6d05014160203a71b653c0c4901babc1"} Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.860066 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.860103 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:27 crc kubenswrapper[4693]: I1204 10:06:27.895814 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-57857fb86f-8m84s" podStartSLOduration=7.895797144 podStartE2EDuration="7.895797144s" podCreationTimestamp="2025-12-04 10:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:27.894242912 +0000 UTC m=+1433.791836665" watchObservedRunningTime="2025-12-04 10:06:27.895797144 +0000 UTC m=+1433.793390897" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.013052 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.244682 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.423502 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tjkb9"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.424683 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.442416 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tjkb9"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.512965 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.563999 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4252s"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.574630 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn7j\" (UniqueName: \"kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.575106 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.583076 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.620396 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.657160 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4252s"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.681787 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.681857 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn7j\" (UniqueName: \"kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.681908 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdkb\" (UniqueName: \"kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.682006 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.682745 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.725843 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn7j\" (UniqueName: \"kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j\") pod \"nova-api-db-create-tjkb9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.772408 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hllxc"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.801423 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.803078 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.843350 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.843482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdkb\" (UniqueName: \"kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.845319 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.918628 4693 generic.go:334] "Generic (PLEG): container finished" podID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerID="5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284" exitCode=0 Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.918724 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" event={"ID":"dc52ec7b-620c-419c-9f73-792c1b0c638f","Type":"ContainerDied","Data":"5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284"} Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.925008 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hllxc"] Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.940195 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdkb\" (UniqueName: \"kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb\") pod \"nova-cell0-db-create-4252s\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.958116 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjfxt\" (UniqueName: \"kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.958846 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.975326 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerStarted","Data":"0ba7de7b0cade5d2d370d91c506946c78a9f8a8ff167f7846320acbbedf2e9a9"} Dec 04 10:06:28 crc kubenswrapper[4693]: I1204 10:06:28.983225 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerStarted","Data":"811a1e106b95c67b632e7e7fb20e5b023bdaca7fa2019a8c3e486316e607a724"} Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.003582 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.004987 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-752a-account-create-update-xdxqj"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.036643 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerStarted","Data":"3f91298bff0ed46550b19fdbf5e9a167e86b5b56c0cee309b7482a9ef4a1bcec"} Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.036819 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.047698 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerStarted","Data":"580f7b6b02054c94c6d9446f518b09b822a88a1dc47d5a2e4cacbc06d5a28789"} Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.047838 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-752a-account-create-update-xdxqj"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.082844 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.087318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjfxt\" (UniqueName: \"kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.087495 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.088736 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.133597 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjfxt\" (UniqueName: \"kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt\") pod \"nova-cell1-db-create-hllxc\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.164415 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.168369 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0a0a-account-create-update-5cbb7"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.169925 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.182526 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.187545 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0a0a-account-create-update-5cbb7"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.189034 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4blf\" (UniqueName: \"kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.189188 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.237153 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f3cd-account-create-update-5sv79"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.238353 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.247684 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.248200 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f3cd-account-create-update-5sv79"] Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.303058 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.303130 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.303208 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnlj\" (UniqueName: \"kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.303261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4blf\" (UniqueName: \"kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.304475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.328739 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4blf\" (UniqueName: \"kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf\") pod \"nova-api-752a-account-create-update-xdxqj\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.354142 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.416838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " 
pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.417640 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.417893 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.417957 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnlj\" (UniqueName: \"kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.418087 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2ghd\" (UniqueName: \"kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.440538 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.445706 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gnlj\" (UniqueName: \"kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj\") pod \"nova-cell0-0a0a-account-create-update-5cbb7\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.512766 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.524157 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2ghd\" (UniqueName: \"kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.524286 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.525016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.554685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2ghd\" (UniqueName: \"kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd\") pod \"nova-cell1-f3cd-account-create-update-5sv79\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.740159 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:29 crc kubenswrapper[4693]: I1204 10:06:29.952524 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tjkb9"] Dec 04 10:06:30 crc kubenswrapper[4693]: I1204 10:06:30.074924 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerStarted","Data":"1e6df36b51fb539886f770c28a8e725218ee29df6695ef24b218174a1e168edf"} Dec 04 10:06:30 crc kubenswrapper[4693]: I1204 10:06:30.083205 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57857fb86f-8m84s" Dec 04 10:06:30 crc kubenswrapper[4693]: I1204 10:06:30.202135 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hllxc"] Dec 04 10:06:30 crc kubenswrapper[4693]: I1204 10:06:30.337904 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4252s"] Dec 04 10:06:30 crc kubenswrapper[4693]: I1204 10:06:30.635031 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:30 crc kubenswrapper[4693]: W1204 10:06:30.922106 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e19a65e_3f81_4663_8041_8f2186d3d6c2.slice/crio-b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7 WatchSource:0}: Error finding container b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7: Status 404 returned error can't find the container with id b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7 Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.061223 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0a0a-account-create-update-5cbb7"] Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.152227 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" event={"ID":"1e19a65e-3f81-4663-8041-8f2186d3d6c2","Type":"ContainerStarted","Data":"b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.179504 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4252s" event={"ID":"00409360-0d5e-4451-b83d-84fbbf011c66","Type":"ContainerStarted","Data":"9043856f03a25f833e89e8113898e562ee3e168e9c4a3a464c6449f22ddbfb92"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.242379 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" event={"ID":"dc52ec7b-620c-419c-9f73-792c1b0c638f","Type":"ContainerStarted","Data":"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.244455 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.271108 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tjkb9" event={"ID":"8e822040-7ff2-49be-bd68-70dc15db9ff9","Type":"ContainerStarted","Data":"4b08a49d277eb92e7af42179c87b74838058955ac195aa9471c9f746f4344c3b"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.291225 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerStarted","Data":"0ce85455c29d0eaa365286afe37bf6f2fe92a7781e24d3d6ef3c781446c9dc5a"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.371185 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hllxc" event={"ID":"430ee024-3291-4a26-865f-4d1300bf5ea9","Type":"ContainerStarted","Data":"26306bfb836e702e94c3e1ac81577838aa9db1fbb326f967020a17d9f060cabf"} Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.395403 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" podStartSLOduration=5.395380821 podStartE2EDuration="5.395380821s" podCreationTimestamp="2025-12-04 10:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:31.284073528 +0000 UTC m=+1437.181667281" watchObservedRunningTime="2025-12-04 10:06:31.395380821 +0000 UTC m=+1437.292974594" Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.536097 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-752a-account-create-update-xdxqj"] Dec 04 10:06:31 crc kubenswrapper[4693]: I1204 10:06:31.715373 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f3cd-account-create-update-5sv79"] Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.432491 4693 generic.go:334] "Generic (PLEG): container finished" podID="00409360-0d5e-4451-b83d-84fbbf011c66" containerID="1130c7cc91ecaad62e289e83e7d312bb1bb1acedcefe385696f1aa2de6f6b2e2" exitCode=0 Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.432713 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4252s" event={"ID":"00409360-0d5e-4451-b83d-84fbbf011c66","Type":"ContainerDied","Data":"1130c7cc91ecaad62e289e83e7d312bb1bb1acedcefe385696f1aa2de6f6b2e2"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.443590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-752a-account-create-update-xdxqj" event={"ID":"a0dd0393-4f73-4cbd-be2a-4b4471ea154c","Type":"ContainerStarted","Data":"57c0550fdc059f86a479d5764758fdf15374fd96b1331069bf1bb0674057b8b4"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.447184 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerStarted","Data":"b9283919629fe583cb70890e2f381584b6304f6d566c5af32dcc4d63ea2d2409"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.458704 4693 generic.go:334] "Generic (PLEG): container finished" podID="8e822040-7ff2-49be-bd68-70dc15db9ff9" containerID="1403c2a6800e91b083b88e01018f1c82e58505ca5db9710670f9173d0b49fc74" exitCode=0 Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.458802 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tjkb9" event={"ID":"8e822040-7ff2-49be-bd68-70dc15db9ff9","Type":"ContainerDied","Data":"1403c2a6800e91b083b88e01018f1c82e58505ca5db9710670f9173d0b49fc74"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.495738 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerStarted","Data":"9e6a376c8dcf96734ed999bc5935e629a4692f67287dca9f4f2d00d0fa0ef7ec"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.495919 4693 generic.go:334] "Generic 
(PLEG): container finished" podID="430ee024-3291-4a26-865f-4d1300bf5ea9" containerID="309b3a1bca1f57660263110cf5e57e4c7231c19b667ca928f625270e3c3e1eaa" exitCode=0 Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.495971 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hllxc" event={"ID":"430ee024-3291-4a26-865f-4d1300bf5ea9","Type":"ContainerDied","Data":"309b3a1bca1f57660263110cf5e57e4c7231c19b667ca928f625270e3c3e1eaa"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.503863 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" event={"ID":"0805403f-8f31-4183-a1b6-d1eedcb64a8d","Type":"ContainerStarted","Data":"15aff1ca0da47a4c63e3bea4257ed243ec53650d4220aaec986b20d91630f5d5"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.511626 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" event={"ID":"1e19a65e-3f81-4663-8041-8f2186d3d6c2","Type":"ContainerStarted","Data":"50639091b6440ac439ad39a9f535dc9133f75266600ec01466240f152f9a11af"} Dec 04 10:06:32 crc kubenswrapper[4693]: I1204 10:06:32.546689 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" podStartSLOduration=4.546670052 podStartE2EDuration="4.546670052s" podCreationTimestamp="2025-12-04 10:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:32.53898925 +0000 UTC m=+1438.436583003" watchObservedRunningTime="2025-12-04 10:06:32.546670052 +0000 UTC m=+1438.444263805" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.397093 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dcd5dc8-znmkm" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.507939 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.508241 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d6f6d8ffd-tv96v" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api-log" containerID="cri-o://87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.508959 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d6f6d8ffd-tv96v" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api" containerID="cri-o://db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.558884 4693 generic.go:334] "Generic (PLEG): container finished" podID="0805403f-8f31-4183-a1b6-d1eedcb64a8d" containerID="095835de9d6a0d01ca21c9eb02b0bd85fe078f73384a7b8dbc9a177551b4bcb4" exitCode=0 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.558945 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" event={"ID":"0805403f-8f31-4183-a1b6-d1eedcb64a8d","Type":"ContainerDied","Data":"095835de9d6a0d01ca21c9eb02b0bd85fe078f73384a7b8dbc9a177551b4bcb4"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.578132 4693 generic.go:334] "Generic (PLEG): container finished" podID="1e19a65e-3f81-4663-8041-8f2186d3d6c2" 
containerID="50639091b6440ac439ad39a9f535dc9133f75266600ec01466240f152f9a11af" exitCode=0 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.578324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" event={"ID":"1e19a65e-3f81-4663-8041-8f2186d3d6c2","Type":"ContainerDied","Data":"50639091b6440ac439ad39a9f535dc9133f75266600ec01466240f152f9a11af"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.580726 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-752a-account-create-update-xdxqj" event={"ID":"a0dd0393-4f73-4cbd-be2a-4b4471ea154c","Type":"ContainerStarted","Data":"7a63ce2b45cffe3f9b5a685e4ac1159c6ded18084f9dae5f791bdd6fb3752530"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.586822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerStarted","Data":"7adf8d5297cf0bc240de53308667a77c92a8347958c34b7c249b89d79812196e"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.612949 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-752a-account-create-update-xdxqj" podStartSLOduration=5.612925336 podStartE2EDuration="5.612925336s" podCreationTimestamp="2025-12-04 10:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:33.608122794 +0000 UTC m=+1439.505716547" watchObservedRunningTime="2025-12-04 10:06:33.612925336 +0000 UTC m=+1439.510519099" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.613654 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerStarted","Data":"f5ee56828eb494c4043cdb9064307baf016bd4406acabe7e6bb52a3e015d5774"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.623367 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api-log" containerID="cri-o://0ce85455c29d0eaa365286afe37bf6f2fe92a7781e24d3d6ef3c781446c9dc5a" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.623812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerStarted","Data":"e9f75cc688350b4628d50a2d8c4444af6e55ab3be187a5098e317699d2f37224"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.623860 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.623905 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api" containerID="cri-o://e9f75cc688350b4628d50a2d8c4444af6e55ab3be187a5098e317699d2f37224" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.683207 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerStarted","Data":"31eeb9befd7aebe78701a71e335aeef29755fe447d9459ca11df2e55d62cfc27"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.683566 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-central-agent" containerID="cri-o://cb1c2b2a7ebfb705d9d4092b900bd3f05395c980257a42dec3bf73c1bdd47704" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.683869 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.683953 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="proxy-httpd" containerID="cri-o://31eeb9befd7aebe78701a71e335aeef29755fe447d9459ca11df2e55d62cfc27" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.684009 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="sg-core" containerID="cri-o://1e6df36b51fb539886f770c28a8e725218ee29df6695ef24b218174a1e168edf" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.684065 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-notification-agent" containerID="cri-o://a213ec1caa55e687834ca38482091cc97cc8a8a46e7b7c50738ec3d3084968c6" gracePeriod=30 Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.702913 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=5.505339667 podStartE2EDuration="7.70289064s" podCreationTimestamp="2025-12-04 10:06:26 +0000 UTC" firstStartedPulling="2025-12-04 10:06:28.574392932 +0000 UTC m=+1434.471986685" lastFinishedPulling="2025-12-04 10:06:30.771943905 +0000 UTC m=+1436.669537658" observedRunningTime="2025-12-04 10:06:33.675847224 +0000 UTC m=+1439.573440977" watchObservedRunningTime="2025-12-04 10:06:33.70289064 +0000 UTC m=+1439.600484393" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.719428 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerStarted","Data":"b5dee9dd056623b3908ef81e03563ef30a66136a6b41eac5fdbfff9a8eb66d62"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.719515 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerStarted","Data":"3ff5fde1ea53bacd039524b19e581c1194ed38fa23ab5df3b223ae68087f4abe"} Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.783302 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=5.537794635 podStartE2EDuration="7.78326737s" podCreationTimestamp="2025-12-04 10:06:26 +0000 UTC" firstStartedPulling="2025-12-04 10:06:28.547997014 +0000 UTC m=+1434.445590767" lastFinishedPulling="2025-12-04 10:06:30.793469749 +0000 UTC m=+1436.691063502" observedRunningTime="2025-12-04 10:06:33.721664329 +0000 UTC m=+1439.619258082" watchObservedRunningTime="2025-12-04 10:06:33.78326737 +0000 UTC m=+1439.680861123" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.890977 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.890947144 podStartE2EDuration="6.890947144s" podCreationTimestamp="2025-12-04 10:06:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:33.74996441 +0000 UTC m=+1439.647558163" watchObservedRunningTime="2025-12-04 10:06:33.890947144 +0000 UTC m=+1439.788540897" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.898428 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.099391338 podStartE2EDuration="22.89840802s" podCreationTimestamp="2025-12-04 10:06:11 +0000 UTC" firstStartedPulling="2025-12-04 10:06:12.845526009 +0000 UTC m=+1418.743119762" lastFinishedPulling="2025-12-04 10:06:31.644542691 +0000 UTC m=+1437.542136444" observedRunningTime="2025-12-04 10:06:33.795040635 +0000 UTC m=+1439.692634388" watchObservedRunningTime="2025-12-04 10:06:33.89840802 +0000 UTC m=+1439.796001773" Dec 04 10:06:33 crc kubenswrapper[4693]: I1204 10:06:33.919114 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.291130118 podStartE2EDuration="7.919089951s" podCreationTimestamp="2025-12-04 10:06:26 +0000 UTC" firstStartedPulling="2025-12-04 10:06:28.027234494 +0000 UTC m=+1433.924828247" lastFinishedPulling="2025-12-04 10:06:29.655194327 +0000 UTC m=+1435.552788080" observedRunningTime="2025-12-04 10:06:33.828280643 +0000 UTC m=+1439.725874396" watchObservedRunningTime="2025-12-04 10:06:33.919089951 +0000 UTC m=+1439.816683694" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.634150 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.647185 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.706529 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.760631 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts\") pod \"8e822040-7ff2-49be-bd68-70dc15db9ff9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.760702 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xdkb\" (UniqueName: \"kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb\") pod \"00409360-0d5e-4451-b83d-84fbbf011c66\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.760925 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts\") pod \"00409360-0d5e-4451-b83d-84fbbf011c66\" (UID: \"00409360-0d5e-4451-b83d-84fbbf011c66\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.761046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts\") pod \"430ee024-3291-4a26-865f-4d1300bf5ea9\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.761076 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsn7j\" (UniqueName: \"kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j\") pod \"8e822040-7ff2-49be-bd68-70dc15db9ff9\" (UID: \"8e822040-7ff2-49be-bd68-70dc15db9ff9\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.761116 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjfxt\" (UniqueName: \"kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt\") pod \"430ee024-3291-4a26-865f-4d1300bf5ea9\" (UID: \"430ee024-3291-4a26-865f-4d1300bf5ea9\") " Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.761978 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00409360-0d5e-4451-b83d-84fbbf011c66" (UID: "00409360-0d5e-4451-b83d-84fbbf011c66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.762498 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "430ee024-3291-4a26-865f-4d1300bf5ea9" (UID: "430ee024-3291-4a26-865f-4d1300bf5ea9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.763083 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00409360-0d5e-4451-b83d-84fbbf011c66-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.763104 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/430ee024-3291-4a26-865f-4d1300bf5ea9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.763321 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e822040-7ff2-49be-bd68-70dc15db9ff9" (UID: "8e822040-7ff2-49be-bd68-70dc15db9ff9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.771747 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb" (OuterVolumeSpecName: "kube-api-access-6xdkb") pod "00409360-0d5e-4451-b83d-84fbbf011c66" (UID: "00409360-0d5e-4451-b83d-84fbbf011c66"). InnerVolumeSpecName "kube-api-access-6xdkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.772649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt" (OuterVolumeSpecName: "kube-api-access-sjfxt") pod "430ee024-3291-4a26-865f-4d1300bf5ea9" (UID: "430ee024-3291-4a26-865f-4d1300bf5ea9"). InnerVolumeSpecName "kube-api-access-sjfxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776454 4693 generic.go:334] "Generic (PLEG): container finished" podID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerID="31eeb9befd7aebe78701a71e335aeef29755fe447d9459ca11df2e55d62cfc27" exitCode=0 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776502 4693 generic.go:334] "Generic (PLEG): container finished" podID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerID="1e6df36b51fb539886f770c28a8e725218ee29df6695ef24b218174a1e168edf" exitCode=2 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776513 4693 generic.go:334] "Generic (PLEG): container finished" podID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerID="a213ec1caa55e687834ca38482091cc97cc8a8a46e7b7c50738ec3d3084968c6" exitCode=0 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776522 4693 generic.go:334] "Generic (PLEG): container finished" podID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerID="cb1c2b2a7ebfb705d9d4092b900bd3f05395c980257a42dec3bf73c1bdd47704" exitCode=0 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776536 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerDied","Data":"31eeb9befd7aebe78701a71e335aeef29755fe447d9459ca11df2e55d62cfc27"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776681 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerDied","Data":"1e6df36b51fb539886f770c28a8e725218ee29df6695ef24b218174a1e168edf"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776699 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerDied","Data":"a213ec1caa55e687834ca38482091cc97cc8a8a46e7b7c50738ec3d3084968c6"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.776710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerDied","Data":"cb1c2b2a7ebfb705d9d4092b900bd3f05395c980257a42dec3bf73c1bdd47704"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.778963 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j" (OuterVolumeSpecName: "kube-api-access-hsn7j") pod "8e822040-7ff2-49be-bd68-70dc15db9ff9" (UID: "8e822040-7ff2-49be-bd68-70dc15db9ff9"). InnerVolumeSpecName "kube-api-access-hsn7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.788070 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hllxc" event={"ID":"430ee024-3291-4a26-865f-4d1300bf5ea9","Type":"ContainerDied","Data":"26306bfb836e702e94c3e1ac81577838aa9db1fbb326f967020a17d9f060cabf"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.788147 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26306bfb836e702e94c3e1ac81577838aa9db1fbb326f967020a17d9f060cabf" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.788260 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hllxc" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.814560 4693 generic.go:334] "Generic (PLEG): container finished" podID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerID="87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae" exitCode=143 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.814790 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerDied","Data":"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.840494 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4252s" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.840492 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4252s" event={"ID":"00409360-0d5e-4451-b83d-84fbbf011c66","Type":"ContainerDied","Data":"9043856f03a25f833e89e8113898e562ee3e168e9c4a3a464c6449f22ddbfb92"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.840578 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9043856f03a25f833e89e8113898e562ee3e168e9c4a3a464c6449f22ddbfb92" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.845512 4693 generic.go:334] "Generic (PLEG): container finished" podID="a0dd0393-4f73-4cbd-be2a-4b4471ea154c" containerID="7a63ce2b45cffe3f9b5a685e4ac1159c6ded18084f9dae5f791bdd6fb3752530" exitCode=0 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.846409 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-752a-account-create-update-xdxqj" event={"ID":"a0dd0393-4f73-4cbd-be2a-4b4471ea154c","Type":"ContainerDied","Data":"7a63ce2b45cffe3f9b5a685e4ac1159c6ded18084f9dae5f791bdd6fb3752530"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.850667 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tjkb9" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.851666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tjkb9" event={"ID":"8e822040-7ff2-49be-bd68-70dc15db9ff9","Type":"ContainerDied","Data":"4b08a49d277eb92e7af42179c87b74838058955ac195aa9471c9f746f4344c3b"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.851730 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b08a49d277eb92e7af42179c87b74838058955ac195aa9471c9f746f4344c3b" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.854157 4693 generic.go:334] "Generic (PLEG): container finished" podID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerID="e9f75cc688350b4628d50a2d8c4444af6e55ab3be187a5098e317699d2f37224" exitCode=0 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.854201 4693 generic.go:334] "Generic (PLEG): container finished" podID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerID="0ce85455c29d0eaa365286afe37bf6f2fe92a7781e24d3d6ef3c781446c9dc5a" exitCode=143 Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.854539 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerDied","Data":"e9f75cc688350b4628d50a2d8c4444af6e55ab3be187a5098e317699d2f37224"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.854566 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerDied","Data":"0ce85455c29d0eaa365286afe37bf6f2fe92a7781e24d3d6ef3c781446c9dc5a"} Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.865290 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e822040-7ff2-49be-bd68-70dc15db9ff9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.865349 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xdkb\" (UniqueName: \"kubernetes.io/projected/00409360-0d5e-4451-b83d-84fbbf011c66-kube-api-access-6xdkb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.865361 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsn7j\" (UniqueName: \"kubernetes.io/projected/8e822040-7ff2-49be-bd68-70dc15db9ff9-kube-api-access-hsn7j\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:34 crc kubenswrapper[4693]: I1204 10:06:34.865370 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjfxt\" (UniqueName: \"kubernetes.io/projected/430ee024-3291-4a26-865f-4d1300bf5ea9-kube-api-access-sjfxt\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.178049 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.200725 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289260 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289403 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289435 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289591 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.289627 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8cqz\" (UniqueName: \"kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290594 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290623 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290748 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: 
\"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290782 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290811 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwqp6\" (UniqueName: \"kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290833 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs\") pod \"066e8361-fd94-4e1d-8727-25a9a432f8e0\" (UID: \"066e8361-fd94-4e1d-8727-25a9a432f8e0\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.290984 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd\") pod \"099d5d4d-d73b-442f-8210-d65c8e2a8317\" (UID: \"099d5d4d-d73b-442f-8210-d65c8e2a8317\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.292036 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.302510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.303209 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs" (OuterVolumeSpecName: "logs") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.305037 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts" (OuterVolumeSpecName: "scripts") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.322953 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.326300 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.326599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz" (OuterVolumeSpecName: "kube-api-access-k8cqz") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "kube-api-access-k8cqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.334576 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts" (OuterVolumeSpecName: "scripts") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.344513 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6" (OuterVolumeSpecName: "kube-api-access-bwqp6") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "kube-api-access-bwqp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.383873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.393990 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394027 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394037 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8cqz\" (UniqueName: \"kubernetes.io/projected/066e8361-fd94-4e1d-8727-25a9a432f8e0-kube-api-access-k8cqz\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394048 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394057 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394065 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwqp6\" (UniqueName: \"kubernetes.io/projected/099d5d4d-d73b-442f-8210-d65c8e2a8317-kube-api-access-bwqp6\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394073 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/066e8361-fd94-4e1d-8727-25a9a432f8e0-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394081 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/099d5d4d-d73b-442f-8210-d65c8e2a8317-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394092 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/066e8361-fd94-4e1d-8727-25a9a432f8e0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.394100 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.446508 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.491617 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.498914 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.547253 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.547302 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.610567 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.648771 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts\") pod \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.648858 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gnlj\" (UniqueName: \"kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj\") pod \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\" (UID: \"1e19a65e-3f81-4663-8041-8f2186d3d6c2\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.652276 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e19a65e-3f81-4663-8041-8f2186d3d6c2" (UID: "1e19a65e-3f81-4663-8041-8f2186d3d6c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.686074 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj" (OuterVolumeSpecName: "kube-api-access-6gnlj") pod "1e19a65e-3f81-4663-8041-8f2186d3d6c2" (UID: "1e19a65e-3f81-4663-8041-8f2186d3d6c2"). InnerVolumeSpecName "kube-api-access-6gnlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.688662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data" (OuterVolumeSpecName: "config-data") pod "099d5d4d-d73b-442f-8210-d65c8e2a8317" (UID: "099d5d4d-d73b-442f-8210-d65c8e2a8317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.696582 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data" (OuterVolumeSpecName: "config-data") pod "066e8361-fd94-4e1d-8727-25a9a432f8e0" (UID: "066e8361-fd94-4e1d-8727-25a9a432f8e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756039 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2ghd\" (UniqueName: \"kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd\") pod \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756191 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts\") pod \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\" (UID: \"0805403f-8f31-4183-a1b6-d1eedcb64a8d\") " Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756881 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099d5d4d-d73b-442f-8210-d65c8e2a8317-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756912 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066e8361-fd94-4e1d-8727-25a9a432f8e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756927 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e19a65e-3f81-4663-8041-8f2186d3d6c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.756945 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gnlj\" (UniqueName: \"kubernetes.io/projected/1e19a65e-3f81-4663-8041-8f2186d3d6c2-kube-api-access-6gnlj\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.757445 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0805403f-8f31-4183-a1b6-d1eedcb64a8d" (UID: "0805403f-8f31-4183-a1b6-d1eedcb64a8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.769402 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd" (OuterVolumeSpecName: "kube-api-access-q2ghd") pod "0805403f-8f31-4183-a1b6-d1eedcb64a8d" (UID: "0805403f-8f31-4183-a1b6-d1eedcb64a8d"). InnerVolumeSpecName "kube-api-access-q2ghd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.859403 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0805403f-8f31-4183-a1b6-d1eedcb64a8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.859438 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2ghd\" (UniqueName: \"kubernetes.io/projected/0805403f-8f31-4183-a1b6-d1eedcb64a8d-kube-api-access-q2ghd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.869212 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.869198 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3cd-account-create-update-5sv79" event={"ID":"0805403f-8f31-4183-a1b6-d1eedcb64a8d","Type":"ContainerDied","Data":"15aff1ca0da47a4c63e3bea4257ed243ec53650d4220aaec986b20d91630f5d5"} Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.869385 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15aff1ca0da47a4c63e3bea4257ed243ec53650d4220aaec986b20d91630f5d5" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.871112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" event={"ID":"1e19a65e-3f81-4663-8041-8f2186d3d6c2","Type":"ContainerDied","Data":"b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7"} Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.871136 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0a0a-account-create-update-5cbb7" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.871145 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b66c5b881f9db468780feac914fdfb1daa5dbbc794f764ae5b23f827b42d6ff7" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.873189 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"066e8361-fd94-4e1d-8727-25a9a432f8e0","Type":"ContainerDied","Data":"3f91298bff0ed46550b19fdbf5e9a167e86b5b56c0cee309b7482a9ef4a1bcec"} Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.873239 4693 scope.go:117] "RemoveContainer" containerID="e9f75cc688350b4628d50a2d8c4444af6e55ab3be187a5098e317699d2f37224" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.873512 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.882975 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.883190 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"099d5d4d-d73b-442f-8210-d65c8e2a8317","Type":"ContainerDied","Data":"8b9825a003f2852c43372176787479ff059af33d693c842dccdfb1b481209b76"} Dec 04 10:06:35 crc kubenswrapper[4693]: I1204 10:06:35.966023 4693 scope.go:117] "RemoveContainer" containerID="0ce85455c29d0eaa365286afe37bf6f2fe92a7781e24d3d6ef3c781446c9dc5a" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.011084 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.036857 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.091792 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.114295 4693 scope.go:117] "RemoveContainer" containerID="31eeb9befd7aebe78701a71e335aeef29755fe447d9459ca11df2e55d62cfc27" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.128414 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.173246 4693 scope.go:117] "RemoveContainer" containerID="1e6df36b51fb539886f770c28a8e725218ee29df6695ef24b218174a1e168edf" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.186852 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187559 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430ee024-3291-4a26-865f-4d1300bf5ea9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187578 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="430ee024-3291-4a26-865f-4d1300bf5ea9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187591 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00409360-0d5e-4451-b83d-84fbbf011c66" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187597 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="00409360-0d5e-4451-b83d-84fbbf011c66" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187614 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-notification-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187624 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-notification-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187635 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api-log" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187641 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api-log" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187658 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="proxy-httpd" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187664 4693 
state_mem.go:107] "Deleted CPUSet assignment" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="proxy-httpd" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187683 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187689 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187709 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e19a65e-3f81-4663-8041-8f2186d3d6c2" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187714 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e19a65e-3f81-4663-8041-8f2186d3d6c2" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187724 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="sg-core" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187730 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="sg-core" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187753 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0805403f-8f31-4183-a1b6-d1eedcb64a8d" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187759 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0805403f-8f31-4183-a1b6-d1eedcb64a8d" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187769 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-central-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187776 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-central-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: E1204 10:06:36.187786 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e822040-7ff2-49be-bd68-70dc15db9ff9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.187822 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e822040-7ff2-49be-bd68-70dc15db9ff9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188018 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-notification-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188030 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api-log" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188039 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="430ee024-3291-4a26-865f-4d1300bf5ea9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188053 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="ceilometer-central-agent" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188060 4693 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0805403f-8f31-4183-a1b6-d1eedcb64a8d" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188068 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e19a65e-3f81-4663-8041-8f2186d3d6c2" containerName="mariadb-account-create-update" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188077 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e822040-7ff2-49be-bd68-70dc15db9ff9" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188089 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="sg-core" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188099 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" containerName="proxy-httpd" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188111 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="00409360-0d5e-4451-b83d-84fbbf011c66" containerName="mariadb-database-create" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.188128 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" containerName="cinder-api" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.190307 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.196827 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.197092 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.230483 4693 scope.go:117] "RemoveContainer" containerID="a213ec1caa55e687834ca38482091cc97cc8a8a46e7b7c50738ec3d3084968c6" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.259881 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.262652 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.267712 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.268081 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.268271 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.269794 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270024 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270247 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270507 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270615 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdw87\" (UniqueName: \"kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270774 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.270970 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.286696 4693 scope.go:117] "RemoveContainer" containerID="cb1c2b2a7ebfb705d9d4092b900bd3f05395c980257a42dec3bf73c1bdd47704" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.294664 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.309507 
4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373067 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373170 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373216 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373256 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373291 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373323 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-scripts\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373366 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373409 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373426 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373442 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfcdc882-9b7b-4e42-877e-6e8be8597470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373498 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm4d8\" (UniqueName: \"kubernetes.io/projected/bfcdc882-9b7b-4e42-877e-6e8be8597470-kube-api-access-jm4d8\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373539 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373562 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdw87\" (UniqueName: \"kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.373592 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfcdc882-9b7b-4e42-877e-6e8be8597470-logs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.376240 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.376516 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.381551 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.385997 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.386952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.395999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.408982 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdw87\" (UniqueName: \"kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87\") pod \"ceilometer-0\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.475644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-scripts\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476298 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476361 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfcdc882-9b7b-4e42-877e-6e8be8597470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476487 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm4d8\" (UniqueName: \"kubernetes.io/projected/bfcdc882-9b7b-4e42-877e-6e8be8597470-kube-api-access-jm4d8\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476634 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfcdc882-9b7b-4e42-877e-6e8be8597470-logs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476701 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476754 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.476857 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.477635 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfcdc882-9b7b-4e42-877e-6e8be8597470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.478190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfcdc882-9b7b-4e42-877e-6e8be8597470-logs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.482352 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.485193 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066e8361-fd94-4e1d-8727-25a9a432f8e0" path="/var/lib/kubelet/pods/066e8361-fd94-4e1d-8727-25a9a432f8e0/volumes" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.486358 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099d5d4d-d73b-442f-8210-d65c8e2a8317" path="/var/lib/kubelet/pods/099d5d4d-d73b-442f-8210-d65c8e2a8317/volumes" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.487126 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.508452 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.508959 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-scripts\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.509302 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm4d8\" (UniqueName: \"kubernetes.io/projected/bfcdc882-9b7b-4e42-877e-6e8be8597470-kube-api-access-jm4d8\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.511323 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.512044 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.523270 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfcdc882-9b7b-4e42-877e-6e8be8597470-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfcdc882-9b7b-4e42-877e-6e8be8597470\") " pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.527223 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.578985 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts\") pod \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.579136 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4blf\" (UniqueName: \"kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf\") pod \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\" (UID: \"a0dd0393-4f73-4cbd-be2a-4b4471ea154c\") " Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.579477 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0dd0393-4f73-4cbd-be2a-4b4471ea154c" (UID: "a0dd0393-4f73-4cbd-be2a-4b4471ea154c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.579777 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.593554 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf" (OuterVolumeSpecName: "kube-api-access-k4blf") pod "a0dd0393-4f73-4cbd-be2a-4b4471ea154c" (UID: "a0dd0393-4f73-4cbd-be2a-4b4471ea154c"). InnerVolumeSpecName "kube-api-access-k4blf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.690238 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4blf\" (UniqueName: \"kubernetes.io/projected/a0dd0393-4f73-4cbd-be2a-4b4471ea154c-kube-api-access-k4blf\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.794361 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.884871 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.962452 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.962752 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="dnsmasq-dns" containerID="cri-o://8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b" gracePeriod=10 Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.985974 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-752a-account-create-update-xdxqj" event={"ID":"a0dd0393-4f73-4cbd-be2a-4b4471ea154c","Type":"ContainerDied","Data":"57c0550fdc059f86a479d5764758fdf15374fd96b1331069bf1bb0674057b8b4"} Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.989449 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c0550fdc059f86a479d5764758fdf15374fd96b1331069bf1bb0674057b8b4" Dec 04 10:06:36 crc kubenswrapper[4693]: I1204 10:06:36.989558 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-752a-account-create-update-xdxqj" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.019868 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.039897 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.044366 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.169:8080/\": dial tcp 10.217.0.169:8080: connect: connection refused" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.063679 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.135184 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d6f6d8ffd-tv96v" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:47188->10.217.0.163:9311: read: connection reset by peer" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.135375 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d6f6d8ffd-tv96v" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:47172->10.217.0.163:9311: read: connection reset by peer" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.162299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.381044 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.411163 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.544600 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="cinder-backup" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.693909 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.750343 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855297 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzf5r\" (UniqueName: \"kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855363 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs\") pod \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855470 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855507 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvsbv\" (UniqueName: \"kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv\") pod \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855542 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle\") pod \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855587 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855633 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855687 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855711 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb\") pod \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\" (UID: \"c964bd03-cc2a-461a-a3ad-3e8118ed8a82\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855758 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data\") pod \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\" (UID: 
\"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.855820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom\") pod \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\" (UID: \"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9\") " Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.863655 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs" (OuterVolumeSpecName: "logs") pod "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" (UID: "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.870645 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r" (OuterVolumeSpecName: "kube-api-access-xzf5r") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "kube-api-access-xzf5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.872775 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" (UID: "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.888593 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv" (OuterVolumeSpecName: "kube-api-access-mvsbv") pod "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" (UID: "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9"). InnerVolumeSpecName "kube-api-access-mvsbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.914690 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" (UID: "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.945668 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config" (OuterVolumeSpecName: "config") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.952770 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data" (OuterVolumeSpecName: "config-data") pod "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" (UID: "35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963187 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963226 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963238 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963251 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzf5r\" (UniqueName: \"kubernetes.io/projected/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-kube-api-access-xzf5r\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963264 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963276 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvsbv\" (UniqueName: \"kubernetes.io/projected/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-kube-api-access-mvsbv\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.963284 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.968116 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:37 crc kubenswrapper[4693]: I1204 10:06:37.997178 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.035878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.057421 4693 generic.go:334] "Generic (PLEG): container finished" podID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerID="8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b" exitCode=0 Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.057544 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" event={"ID":"c964bd03-cc2a-461a-a3ad-3e8118ed8a82","Type":"ContainerDied","Data":"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.057587 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" event={"ID":"c964bd03-cc2a-461a-a3ad-3e8118ed8a82","Type":"ContainerDied","Data":"eb1382ad5b6b5ea6fae11f4a78942f6b2a23c17e49452098efdf9ce049ac9b84"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.057636 4693 scope.go:117] "RemoveContainer" containerID="8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.057810 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-kwjv6" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.062814 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerStarted","Data":"7af16363c3be5d88312f098ba5dd456eafbd3067534f81878b21c0067380ff40"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.068101 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.068281 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.068394 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.070603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfcdc882-9b7b-4e42-877e-6e8be8597470","Type":"ContainerStarted","Data":"3f8a1f03ffd468b58e9d414b07bf97878ca4ef4705883bd66ffd97b10393ffbd"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.071885 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c964bd03-cc2a-461a-a3ad-3e8118ed8a82" (UID: "c964bd03-cc2a-461a-a3ad-3e8118ed8a82"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.084237 4693 generic.go:334] "Generic (PLEG): container finished" podID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerID="db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9" exitCode=0 Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.084324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerDied","Data":"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.084398 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d6f6d8ffd-tv96v" event={"ID":"35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9","Type":"ContainerDied","Data":"e1e568e454553cec9a27f463fb6404bcc7b8cb42c46760148bbfe3bd0c5a3728"} Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.084523 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d6f6d8ffd-tv96v" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.171845 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c964bd03-cc2a-461a-a3ad-3e8118ed8a82-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.207928 4693 scope.go:117] "RemoveContainer" containerID="e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.222536 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.237528 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d6f6d8ffd-tv96v"] Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.243917 4693 scope.go:117] "RemoveContainer" containerID="8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b" Dec 04 10:06:38 crc kubenswrapper[4693]: E1204 10:06:38.245175 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b\": container with ID starting with 8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b not found: ID does not exist" containerID="8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.245228 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b"} err="failed to get container status \"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b\": rpc error: code = NotFound desc = could not find container \"8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b\": container with ID starting with 8793797bd24fff2ea3923e004a7956356e142b031b98115da1dd1d989a19b44b not found: ID does not exist" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.245256 4693 scope.go:117] "RemoveContainer" containerID="e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834" Dec 04 10:06:38 crc kubenswrapper[4693]: E1204 10:06:38.253486 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834\": container with ID starting with e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834 not found: ID does not exist" containerID="e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.253779 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834"} err="failed to get container status \"e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834\": rpc error: code = NotFound desc = could not find container \"e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834\": container with ID starting with e384047c246a464a459f3934942776cded13bdb67e07d365f3f5311826c36834 not found: ID does not exist" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.254345 4693 scope.go:117] "RemoveContainer" containerID="db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.322822 4693 scope.go:117] "RemoveContainer" containerID="87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.406605 4693 scope.go:117] "RemoveContainer" containerID="db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9" Dec 04 10:06:38 crc kubenswrapper[4693]: E1204 10:06:38.407102 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9\": container with ID starting with db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9 not found: ID does not exist" containerID="db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.407128 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9"} err="failed to get container status \"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9\": rpc error: code = NotFound desc = could not find container \"db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9\": container with ID starting with db046315a0cfd86af952ec142e9da3383d728fd84157270abdca664980541dd9 not found: ID does not exist" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.407151 4693 scope.go:117] "RemoveContainer" containerID="87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae" Dec 04 10:06:38 crc kubenswrapper[4693]: E1204 10:06:38.407434 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae\": container with ID starting with 87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae not found: ID does not exist" containerID="87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.407452 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae"} err="failed to get container status \"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae\": rpc error: code = NotFound desc = could not find container 
\"87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae\": container with ID starting with 87cf49e446b0b4b93cd6728145979739363d1b230f474bc59c7efaa8099f23ae not found: ID does not exist" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.411574 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.420179 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-kwjv6"] Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.477757 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" path="/var/lib/kubelet/pods/35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9/volumes" Dec 04 10:06:38 crc kubenswrapper[4693]: I1204 10:06:38.478649 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" path="/var/lib/kubelet/pods/c964bd03-cc2a-461a-a3ad-3e8118ed8a82/volumes" Dec 04 10:06:39 crc kubenswrapper[4693]: I1204 10:06:39.136704 4693 generic.go:334] "Generic (PLEG): container finished" podID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" containerID="ceeddbea04eada8185ac7490c16a472d90714892ff0cec6ab7c2bb7bc1fcc931" exitCode=0 Dec 04 10:06:39 crc kubenswrapper[4693]: I1204 10:06:39.136835 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lvx99" event={"ID":"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d","Type":"ContainerDied","Data":"ceeddbea04eada8185ac7490c16a472d90714892ff0cec6ab7c2bb7bc1fcc931"} Dec 04 10:06:39 crc kubenswrapper[4693]: I1204 10:06:39.140379 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerStarted","Data":"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba"} Dec 04 10:06:39 crc kubenswrapper[4693]: I1204 10:06:39.146136 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfcdc882-9b7b-4e42-877e-6e8be8597470","Type":"ContainerStarted","Data":"f28a90da599d6fa61497e1ee3ad03e95de595fb4df0b1a0671627f3eff7a6cd0"} Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105137 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ttkgx"] Dec 04 10:06:40 crc kubenswrapper[4693]: E1204 10:06:40.105771 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api-log" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105788 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api-log" Dec 04 10:06:40 crc kubenswrapper[4693]: E1204 10:06:40.105805 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="init" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105811 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="init" Dec 04 10:06:40 crc kubenswrapper[4693]: E1204 10:06:40.105822 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105827 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api" Dec 04 10:06:40 crc 
kubenswrapper[4693]: E1204 10:06:40.105854 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dd0393-4f73-4cbd-be2a-4b4471ea154c" containerName="mariadb-account-create-update" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105860 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dd0393-4f73-4cbd-be2a-4b4471ea154c" containerName="mariadb-account-create-update" Dec 04 10:06:40 crc kubenswrapper[4693]: E1204 10:06:40.105871 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="dnsmasq-dns" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.105878 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="dnsmasq-dns" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.106060 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.106074 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dd0393-4f73-4cbd-be2a-4b4471ea154c" containerName="mariadb-account-create-update" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.106085 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c964bd03-cc2a-461a-a3ad-3e8118ed8a82" containerName="dnsmasq-dns" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.106100 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="35aab557-aaf2-4bf7-b0e6-241f0f2ee0c9" containerName="barbican-api-log" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.106693 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.116785 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.119763 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.120816 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ttkgx"] Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.120939 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r828x" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.233215 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.233262 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.233287 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.233556 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldd4s\" (UniqueName: \"kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.335435 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldd4s\" (UniqueName: \"kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.335601 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.335625 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.335648 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.346053 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.356930 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldd4s\" (UniqueName: \"kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.362937 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.363154 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data\") pod \"nova-cell0-conductor-db-sync-ttkgx\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.425711 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.930544 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ttkgx"] Dec 04 10:06:40 crc kubenswrapper[4693]: W1204 10:06:40.932289 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a25a4b_56b5_461e_b9f0_44fe3b1f5e67.slice/crio-28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127 WatchSource:0}: Error finding container 28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127: Status 404 returned error can't find the container with id 28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127 Dec 04 10:06:40 crc kubenswrapper[4693]: I1204 10:06:40.962314 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lvx99" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.059700 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data\") pod \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.059908 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle\") pod \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.060999 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data\") pod \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.061087 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcqzj\" (UniqueName: \"kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj\") pod \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\" (UID: \"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d\") " Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.066565 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj" (OuterVolumeSpecName: "kube-api-access-bcqzj") pod "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" (UID: "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d"). InnerVolumeSpecName "kube-api-access-bcqzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.069291 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" (UID: "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.070625 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data" (OuterVolumeSpecName: "config-data") pod "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" (UID: "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.102621 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" (UID: "7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.163578 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcqzj\" (UniqueName: \"kubernetes.io/projected/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-kube-api-access-bcqzj\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.163617 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.163629 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.163638 4693 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.361617 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" event={"ID":"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67","Type":"ContainerStarted","Data":"28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127"} Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.363526 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lvx99" event={"ID":"7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d","Type":"ContainerDied","Data":"c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03"} Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.363562 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lvx99" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.363584 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1541dca6c712afeed5b633b60326b1cc5f4a5da804b36bbffa5c57f08fc1b03" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.425257 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: E1204 10:06:41.425683 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" containerName="manila-db-sync" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.425702 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" containerName="manila-db-sync" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.425918 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" containerName="manila-db-sync" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.426896 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.429569 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.430032 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.431520 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.432395 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-dbp65" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.441548 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.532799 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.534700 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.537288 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.548197 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571310 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571412 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571482 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571512 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptjx\" (UniqueName: \"kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571546 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.571895 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.611688 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.613172 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.655397 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.673951 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nv27\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674017 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674048 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674086 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674125 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674142 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptjx\" (UniqueName: \"kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674161 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674186 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674213 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle\") 
pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674293 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674324 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674380 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674412 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.674732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.681450 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.681624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.683517 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.690752 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6ptjx\" (UniqueName: \"kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.710166 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.747865 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776315 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776407 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776499 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776544 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776603 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776680 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776742 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvk5r\" (UniqueName: \"kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776819 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776877 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776939 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nv27\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.776975 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.777013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.777377 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.777613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.785381 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.785487 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.792198 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.793475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.801808 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.812918 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nv27\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27\") pod \"manila-share-share1-0\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.870529 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.882232 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.897238 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.911466 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.934415 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.952628 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.952709 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.957923 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.958891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.958977 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.959007 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvk5r\" (UniqueName: \"kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.959155 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.960167 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.961051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.963475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.964866 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:41 crc kubenswrapper[4693]: I1204 10:06:41.987162 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvk5r\" (UniqueName: \"kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r\") pod \"dnsmasq-dns-57d6d889f-p9sjj\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.029496 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063558 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063633 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063659 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063684 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063733 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.063924 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88zq\" (UniqueName: \"kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.097523 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.102280 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179136 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179245 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179300 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179438 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179661 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.179901 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j88zq\" (UniqueName: \"kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.196544 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.196645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.205197 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.207542 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.212191 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.212802 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.219351 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.232930 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88zq\" (UniqueName: \"kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq\") pod \"manila-api-0\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.248145 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.407520 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfcdc882-9b7b-4e42-877e-6e8be8597470","Type":"ContainerStarted","Data":"9f5746c3e20a722edac0678dac2df82c3085ae63122fca67ed9712482d7a72bb"} Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.407587 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.422762 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="cinder-backup" containerID="cri-o://b9283919629fe583cb70890e2f381584b6304f6d566c5af32dcc4d63ea2d2409" gracePeriod=30 Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.423348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerStarted","Data":"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e"} Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.423652 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="cinder-volume" containerID="cri-o://9e6a376c8dcf96734ed999bc5935e629a4692f67287dca9f4f2d00d0fa0ef7ec" gracePeriod=30 Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.423799 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="probe" containerID="cri-o://7adf8d5297cf0bc240de53308667a77c92a8347958c34b7c249b89d79812196e" gracePeriod=30 Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.423937 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="probe" containerID="cri-o://f5ee56828eb494c4043cdb9064307baf016bd4406acabe7e6bb52a3e015d5774" gracePeriod=30 Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.444496 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.444477721 podStartE2EDuration="7.444477721s" podCreationTimestamp="2025-12-04 10:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:42.428717515 +0000 UTC m=+1448.326311268" watchObservedRunningTime="2025-12-04 10:06:42.444477721 +0000 UTC m=+1448.342071474" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.532193 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.612798 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.619759 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.729539 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:42 crc kubenswrapper[4693]: I1204 10:06:42.961445 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.071646 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.490124 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.500711 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerStarted","Data":"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e"} Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.512552 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerStarted","Data":"27315d7eaccc71d103a614b5010c5f56dc82773602872aa8ce62436a51364861"} Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.523399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerStarted","Data":"a982cb378bff05e757241bf4760539dae613ac269980415b97544a6ae411c973"} Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.536555 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" event={"ID":"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0","Type":"ContainerStarted","Data":"09eae557d0b7250ee9025512f376923331c53d44030bc4bca3aa8312f0181d97"} Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.548588 4693 generic.go:334] "Generic (PLEG): container finished" podID="94261d53-a51c-4f5b-b896-87a957c93c86" containerID="7adf8d5297cf0bc240de53308667a77c92a8347958c34b7c249b89d79812196e" exitCode=0 Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.548758 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerDied","Data":"7adf8d5297cf0bc240de53308667a77c92a8347958c34b7c249b89d79812196e"} Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.549656 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="cinder-scheduler" containerID="cri-o://3ff5fde1ea53bacd039524b19e581c1194ed38fa23ab5df3b223ae68087f4abe" gracePeriod=30 Dec 04 10:06:43 crc kubenswrapper[4693]: I1204 10:06:43.550108 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="probe" containerID="cri-o://b5dee9dd056623b3908ef81e03563ef30a66136a6b41eac5fdbfff9a8eb66d62" gracePeriod=30 Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.609222 4693 
generic.go:334] "Generic (PLEG): container finished" podID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerID="ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f" exitCode=0 Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.610090 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" event={"ID":"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0","Type":"ContainerDied","Data":"ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.623650 4693 generic.go:334] "Generic (PLEG): container finished" podID="94261d53-a51c-4f5b-b896-87a957c93c86" containerID="b9283919629fe583cb70890e2f381584b6304f6d566c5af32dcc4d63ea2d2409" exitCode=0 Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.623762 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerDied","Data":"b9283919629fe583cb70890e2f381584b6304f6d566c5af32dcc4d63ea2d2409"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.683419 4693 generic.go:334] "Generic (PLEG): container finished" podID="87f05066-df1a-4762-a093-fb4485e060f7" containerID="f5ee56828eb494c4043cdb9064307baf016bd4406acabe7e6bb52a3e015d5774" exitCode=0 Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.683455 4693 generic.go:334] "Generic (PLEG): container finished" podID="87f05066-df1a-4762-a093-fb4485e060f7" containerID="9e6a376c8dcf96734ed999bc5935e629a4692f67287dca9f4f2d00d0fa0ef7ec" exitCode=0 Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.683509 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerDied","Data":"f5ee56828eb494c4043cdb9064307baf016bd4406acabe7e6bb52a3e015d5774"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.683546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerDied","Data":"9e6a376c8dcf96734ed999bc5935e629a4692f67287dca9f4f2d00d0fa0ef7ec"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.721631 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerStarted","Data":"19d3deca871c730ec7f45a785c188a9d62294f067f38dbf4ca998260fda6d6d1"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.721672 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerStarted","Data":"1aae7e393102dd3f9c8e3063bfbbf7e22302aa3d56abbf9e0bf033f4080effea"} Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.778880 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.897953 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctnvk\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898034 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898050 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898111 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898142 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898163 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898183 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898201 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898249 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898295 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 
10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898369 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898392 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898473 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898506 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898526 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom\") pod \"87f05066-df1a-4762-a093-fb4485e060f7\" (UID: \"87f05066-df1a-4762-a093-fb4485e060f7\") " Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898589 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898653 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.898680 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.900520 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev" (OuterVolumeSpecName: "dev") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.900618 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run" (OuterVolumeSpecName: "run") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.902454 4693 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.902488 4693 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.902499 4693 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.902510 4693 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.902521 4693 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-dev\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.903629 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.907345 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.914916 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.915966 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.916007 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys" (OuterVolumeSpecName: "sys") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.916038 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.918494 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk" (OuterVolumeSpecName: "kube-api-access-ctnvk") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "kube-api-access-ctnvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.922666 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts" (OuterVolumeSpecName: "scripts") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.956689 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph" (OuterVolumeSpecName: "ceph") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:44 crc kubenswrapper[4693]: I1204 10:06:44.986637 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.008082 4693 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009179 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009191 4693 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009203 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009216 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctnvk\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-kube-api-access-ctnvk\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009228 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/87f05066-df1a-4762-a093-fb4485e060f7-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009239 4693 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-sys\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009251 4693 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.009262 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87f05066-df1a-4762-a093-fb4485e060f7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.033526 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.110484 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.110546 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.110565 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.110658 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.110908 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4c7b\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111069 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111096 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111122 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111151 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111184 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: 
\"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111210 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111226 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111268 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111341 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111437 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.111474 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys\") pod \"94261d53-a51c-4f5b-b896-87a957c93c86\" (UID: \"94261d53-a51c-4f5b-b896-87a957c93c86\") " Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.112266 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.112323 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.112365 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.112385 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113352 4693 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113377 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113389 4693 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113398 4693 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113407 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.113453 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys" (OuterVolumeSpecName: "sys") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.114395 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.115619 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run" (OuterVolumeSpecName: "run") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.115678 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.115708 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev" (OuterVolumeSpecName: "dev") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.115751 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.126126 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts" (OuterVolumeSpecName: "scripts") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.127767 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b" (OuterVolumeSpecName: "kube-api-access-r4c7b") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "kube-api-access-r4c7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.128608 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.136440 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph" (OuterVolumeSpecName: "ceph") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217306 4693 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-sys\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217370 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217383 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4c7b\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-kube-api-access-r4c7b\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217394 4693 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217406 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217415 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/94261d53-a51c-4f5b-b896-87a957c93c86-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217427 4693 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-dev\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217437 4693 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217446 4693 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.217462 4693 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94261d53-a51c-4f5b-b896-87a957c93c86-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.222309 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data" (OuterVolumeSpecName: "config-data") pod "87f05066-df1a-4762-a093-fb4485e060f7" (UID: "87f05066-df1a-4762-a093-fb4485e060f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.304973 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.322980 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.323031 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f05066-df1a-4762-a093-fb4485e060f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.414404 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data" (OuterVolumeSpecName: "config-data") pod "94261d53-a51c-4f5b-b896-87a957c93c86" (UID: "94261d53-a51c-4f5b-b896-87a957c93c86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.426702 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94261d53-a51c-4f5b-b896-87a957c93c86-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.738529 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.738463 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"94261d53-a51c-4f5b-b896-87a957c93c86","Type":"ContainerDied","Data":"0ba7de7b0cade5d2d370d91c506946c78a9f8a8ff167f7846320acbbedf2e9a9"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.739593 4693 scope.go:117] "RemoveContainer" containerID="7adf8d5297cf0bc240de53308667a77c92a8347958c34b7c249b89d79812196e" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.776085 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"87f05066-df1a-4762-a093-fb4485e060f7","Type":"ContainerDied","Data":"811a1e106b95c67b632e7e7fb20e5b023bdaca7fa2019a8c3e486316e607a724"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.776241 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.791914 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerStarted","Data":"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.793276 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.806499 4693 generic.go:334] "Generic (PLEG): container finished" podID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerID="b5dee9dd056623b3908ef81e03563ef30a66136a6b41eac5fdbfff9a8eb66d62" exitCode=0 Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.806551 4693 generic.go:334] "Generic (PLEG): container finished" podID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerID="3ff5fde1ea53bacd039524b19e581c1194ed38fa23ab5df3b223ae68087f4abe" exitCode=0 Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.806657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerDied","Data":"b5dee9dd056623b3908ef81e03563ef30a66136a6b41eac5fdbfff9a8eb66d62"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.806693 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerDied","Data":"3ff5fde1ea53bacd039524b19e581c1194ed38fa23ab5df3b223ae68087f4abe"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.809779 4693 scope.go:117] "RemoveContainer" containerID="b9283919629fe583cb70890e2f381584b6304f6d566c5af32dcc4d63ea2d2409" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.811069 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerStarted","Data":"7ed77c5fe0e00b14fc45c32c458ac5b7d9703c256ba2bb01f5ade60a906cb7c7"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.836978 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.844424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" event={"ID":"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0","Type":"ContainerStarted","Data":"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470"} Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.847534 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.861534 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.898088 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:45 crc kubenswrapper[4693]: E1204 10:06:45.898807 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.898829 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: E1204 10:06:45.898840 4693 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.898849 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: E1204 10:06:45.898864 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="cinder-volume" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.898871 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="cinder-volume" Dec 04 10:06:45 crc kubenswrapper[4693]: E1204 10:06:45.898895 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="cinder-backup" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.898901 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="cinder-backup" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.899113 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.899129 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" containerName="cinder-backup" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.899140 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="cinder-volume" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.899150 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f05066-df1a-4762-a093-fb4485e060f7" containerName="probe" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.900514 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.904162 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.909584 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.24481035 podStartE2EDuration="10.909554364s" podCreationTimestamp="2025-12-04 10:06:35 +0000 UTC" firstStartedPulling="2025-12-04 10:06:37.164736936 +0000 UTC m=+1443.062330689" lastFinishedPulling="2025-12-04 10:06:44.82948095 +0000 UTC m=+1450.727074703" observedRunningTime="2025-12-04 10:06:45.836674492 +0000 UTC m=+1451.734268245" watchObservedRunningTime="2025-12-04 10:06:45.909554364 +0000 UTC m=+1451.807148117" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.929235 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.936677 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" podStartSLOduration=4.936653293 podStartE2EDuration="4.936653293s" podCreationTimestamp="2025-12-04 10:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:45.890514489 +0000 UTC m=+1451.788108242" watchObservedRunningTime="2025-12-04 10:06:45.936653293 +0000 UTC m=+1451.834247046" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.957408 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.972456 4693 scope.go:117] "RemoveContainer" containerID="f5ee56828eb494c4043cdb9064307baf016bd4406acabe7e6bb52a3e015d5774" Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.973526 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:45 crc kubenswrapper[4693]: I1204 10:06:45.991713 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.045774 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:46 crc kubenswrapper[4693]: E1204 10:06:46.046240 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="probe" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.046265 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="probe" Dec 04 10:06:46 crc kubenswrapper[4693]: E1204 10:06:46.046290 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="cinder-scheduler" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.046298 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="cinder-scheduler" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.046510 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" containerName="cinder-scheduler" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.046548 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" 
containerName="probe" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.047643 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.050675 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.051583 4693 scope.go:117] "RemoveContainer" containerID="9e6a376c8dcf96734ed999bc5935e629a4692f67287dca9f4f2d00d0fa0ef7ec" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.056183 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.056265 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.056287 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gslmg\" (UniqueName: \"kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.066558 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.066619 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.066720 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id\") pod \"28d812a2-263f-491a-8804-94ab52f3c3c7\" (UID: \"28d812a2-263f-491a-8804-94ab52f3c3c7\") " Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067421 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067446 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-run\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067503 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067540 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067559 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-lib-modules\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.067784 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.075927 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg" (OuterVolumeSpecName: "kube-api-access-gslmg") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "kube-api-access-gslmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.076575 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.077518 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078155 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-sys\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078175 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078210 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-scripts\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078250 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078270 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078311 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078356 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-ceph\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078380 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-dev\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078636 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078817 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf97w\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-kube-api-access-kf97w\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.078985 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.079012 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28d812a2-263f-491a-8804-94ab52f3c3c7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.079027 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gslmg\" (UniqueName: \"kubernetes.io/projected/28d812a2-263f-491a-8804-94ab52f3c3c7-kube-api-access-gslmg\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.083837 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.086549 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts" (OuterVolumeSpecName: "scripts") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.153345 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181513 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181565 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181601 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181625 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181649 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181669 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-dev\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-ceph\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181707 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-dev\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181729 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181748 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-nvme\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181799 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-sys\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181861 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181879 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhx2\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-kube-api-access-9hhx2\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181977 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.181982 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182048 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-dev\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 
10:06:46.182118 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182186 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf97w\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-kube-api-access-kf97w\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182248 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-run\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182282 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182393 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182429 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182454 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-lib-modules\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182503 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182537 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182546 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182562 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182579 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-run\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182622 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-sys\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182643 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182674 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182700 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-run\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182737 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-scripts\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182751 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182823 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182848 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182888 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.182913 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-sys\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.185714 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca5c459b-21a7-4799-a516-2a270de6e246-lib-modules\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.188723 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-scripts\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.189371 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.189969 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.191126 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca5c459b-21a7-4799-a516-2a270de6e246-config-data-custom\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.193837 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-ceph\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 
10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.208025 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf97w\" (UniqueName: \"kubernetes.io/projected/ca5c459b-21a7-4799-a516-2a270de6e246-kube-api-access-kf97w\") pod \"cinder-backup-0\" (UID: \"ca5c459b-21a7-4799-a516-2a270de6e246\") " pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.221402 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data" (OuterVolumeSpecName: "config-data") pod "28d812a2-263f-491a-8804-94ab52f3c3c7" (UID: "28d812a2-263f-491a-8804-94ab52f3c3c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284138 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-sys\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284181 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhx2\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-kube-api-access-9hhx2\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284213 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284253 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284272 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284350 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284375 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284414 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284446 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284476 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-run\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284504 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284596 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284631 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284657 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-dev\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284688 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284765 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d812a2-263f-491a-8804-94ab52f3c3c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.284828 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.285559 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.285666 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-sys\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286014 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286507 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286596 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286675 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-dev\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286691 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.286729 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/711f27ff-01df-4851-bc37-a7115b5fa624-run\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" 
Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.289378 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.289680 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.290606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.294893 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.295368 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.306113 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711f27ff-01df-4851-bc37-a7115b5fa624-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.314176 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhx2\" (UniqueName: \"kubernetes.io/projected/711f27ff-01df-4851-bc37-a7115b5fa624-kube-api-access-9hhx2\") pod \"cinder-volume-volume1-0\" (UID: \"711f27ff-01df-4851-bc37-a7115b5fa624\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.395031 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.411909 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.515660 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f05066-df1a-4762-a093-fb4485e060f7" path="/var/lib/kubelet/pods/87f05066-df1a-4762-a093-fb4485e060f7/volumes" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.517875 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94261d53-a51c-4f5b-b896-87a957c93c86" path="/var/lib/kubelet/pods/94261d53-a51c-4f5b-b896-87a957c93c86/volumes" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.925377 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"28d812a2-263f-491a-8804-94ab52f3c3c7","Type":"ContainerDied","Data":"580f7b6b02054c94c6d9446f518b09b822a88a1dc47d5a2e4cacbc06d5a28789"} Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.925925 4693 scope.go:117] "RemoveContainer" containerID="b5dee9dd056623b3908ef81e03563ef30a66136a6b41eac5fdbfff9a8eb66d62" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.926177 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.956856 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerStarted","Data":"c1a65b68919f6b99e0d561363427d858c224a65bbd13ca1d92f6fdc533a4dbee"} Dec 04 10:06:46 crc kubenswrapper[4693]: I1204 10:06:46.957487 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.000666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerStarted","Data":"74e817bed32c958cc6ae1123f29c40f1f24cac3f4a4c62bd4357bac0a3d0f2d2"} Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.012487 4693 scope.go:117] "RemoveContainer" containerID="3ff5fde1ea53bacd039524b19e581c1194ed38fa23ab5df3b223ae68087f4abe" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.038153 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.055869 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.068412 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: W1204 10:06:47.068514 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca5c459b_21a7_4799_a516_2a270de6e246.slice/crio-baec1729f2bfae9953376fe3e278e1d7d85379c722e5229a68b17998bc19aa86 WatchSource:0}: Error finding container baec1729f2bfae9953376fe3e278e1d7d85379c722e5229a68b17998bc19aa86: Status 404 returned error can't find the container with id baec1729f2bfae9953376fe3e278e1d7d85379c722e5229a68b17998bc19aa86 Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.080666 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.083441 4693 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.092999 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.093785 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.104912 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.104880412 podStartE2EDuration="6.104880412s" podCreationTimestamp="2025-12-04 10:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:47.022815686 +0000 UTC m=+1452.920409439" watchObservedRunningTime="2025-12-04 10:06:47.104880412 +0000 UTC m=+1453.002474165" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.113465 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.230439225 podStartE2EDuration="6.113437339s" podCreationTimestamp="2025-12-04 10:06:41 +0000 UTC" firstStartedPulling="2025-12-04 10:06:42.702934087 +0000 UTC m=+1448.600527840" lastFinishedPulling="2025-12-04 10:06:43.585932201 +0000 UTC m=+1449.483525954" observedRunningTime="2025-12-04 10:06:47.068376715 +0000 UTC m=+1452.965970458" watchObservedRunningTime="2025-12-04 10:06:47.113437339 +0000 UTC m=+1453.011031092" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.126565 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.126639 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.126673 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7t52\" (UniqueName: \"kubernetes.io/projected/487df7df-e43a-48a6-8350-6b9804d13e39-kube-api-access-v7t52\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.126705 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-scripts\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.126750 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc 
kubenswrapper[4693]: I1204 10:06:47.126795 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487df7df-e43a-48a6-8350-6b9804d13e39-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.229875 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487df7df-e43a-48a6-8350-6b9804d13e39-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.229981 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.230027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.230057 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7t52\" (UniqueName: \"kubernetes.io/projected/487df7df-e43a-48a6-8350-6b9804d13e39-kube-api-access-v7t52\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.230102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-scripts\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.230152 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.230485 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/487df7df-e43a-48a6-8350-6b9804d13e39-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.235320 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-scripts\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.235474 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.235896 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.240450 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487df7df-e43a-48a6-8350-6b9804d13e39-config-data\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.248453 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7t52\" (UniqueName: \"kubernetes.io/projected/487df7df-e43a-48a6-8350-6b9804d13e39-kube-api-access-v7t52\") pod \"cinder-scheduler-0\" (UID: \"487df7df-e43a-48a6-8350-6b9804d13e39\") " pod="openstack/cinder-scheduler-0" Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.282047 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:06:47 crc kubenswrapper[4693]: I1204 10:06:47.510147 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.035039 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"ca5c459b-21a7-4799-a516-2a270de6e246","Type":"ContainerStarted","Data":"0b5b8ae689d8fbb17e5e71a26c56dcf4cf6e1d7106e0ef0a2d1266b71a907f06"} Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.035680 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"ca5c459b-21a7-4799-a516-2a270de6e246","Type":"ContainerStarted","Data":"baec1729f2bfae9953376fe3e278e1d7d85379c722e5229a68b17998bc19aa86"} Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.039003 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api-log" containerID="cri-o://19d3deca871c730ec7f45a785c188a9d62294f067f38dbf4ca998260fda6d6d1" gracePeriod=30 Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.039286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"711f27ff-01df-4851-bc37-a7115b5fa624","Type":"ContainerStarted","Data":"faaa9cb0087f0817418ccbbbf6e8351a5938a1e47cc3571b07d9e7f8652f53d7"} Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.039309 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"711f27ff-01df-4851-bc37-a7115b5fa624","Type":"ContainerStarted","Data":"29ece8520143f3648de678426d6d47c6ffc7599959f7641ec1d9972e7574618a"} Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.040984 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api" containerID="cri-o://c1a65b68919f6b99e0d561363427d858c224a65bbd13ca1d92f6fdc533a4dbee" gracePeriod=30 Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.204168 4693 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:06:48 crc kubenswrapper[4693]: W1204 10:06:48.236018 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487df7df_e43a_48a6_8350_6b9804d13e39.slice/crio-82fa41cbd59488cf1560a1eaa9988a7abb4857bb13e8e4aacd736436542794e0 WatchSource:0}: Error finding container 82fa41cbd59488cf1560a1eaa9988a7abb4857bb13e8e4aacd736436542794e0: Status 404 returned error can't find the container with id 82fa41cbd59488cf1560a1eaa9988a7abb4857bb13e8e4aacd736436542794e0 Dec 04 10:06:48 crc kubenswrapper[4693]: I1204 10:06:48.504466 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d812a2-263f-491a-8804-94ab52f3c3c7" path="/var/lib/kubelet/pods/28d812a2-263f-491a-8804-94ab52f3c3c7/volumes" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.061264 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"487df7df-e43a-48a6-8350-6b9804d13e39","Type":"ContainerStarted","Data":"82fa41cbd59488cf1560a1eaa9988a7abb4857bb13e8e4aacd736436542794e0"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.131077 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"711f27ff-01df-4851-bc37-a7115b5fa624","Type":"ContainerStarted","Data":"73f8b1911a19659c87c1675f19759bfec88c61a9f266e9451a07ffbc0159fe50"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.172698 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerID="c1a65b68919f6b99e0d561363427d858c224a65bbd13ca1d92f6fdc533a4dbee" exitCode=0 Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.173064 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerID="19d3deca871c730ec7f45a785c188a9d62294f067f38dbf4ca998260fda6d6d1" exitCode=143 Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.172898 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerDied","Data":"c1a65b68919f6b99e0d561363427d858c224a65bbd13ca1d92f6fdc533a4dbee"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.173291 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerDied","Data":"19d3deca871c730ec7f45a785c188a9d62294f067f38dbf4ca998260fda6d6d1"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.173578 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e0dc0bac-6d04-48c3-972d-621b14a3a0d9","Type":"ContainerDied","Data":"1aae7e393102dd3f9c8e3063bfbbf7e22302aa3d56abbf9e0bf033f4080effea"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.173667 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aae7e393102dd3f9c8e3063bfbbf7e22302aa3d56abbf9e0bf033f4080effea" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.192275 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.196893 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.19686135 podStartE2EDuration="4.19686135s" podCreationTimestamp="2025-12-04 10:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:49.182921565 +0000 UTC m=+1455.080515318" watchObservedRunningTime="2025-12-04 10:06:49.19686135 +0000 UTC m=+1455.094455103" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.200028 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"ca5c459b-21a7-4799-a516-2a270de6e246","Type":"ContainerStarted","Data":"e85b896145913a6074aba563ddb96e5f0f3bb3cfc77cb0785a77f3ea0a949b65"} Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235076 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235162 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235473 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j88zq\" (UniqueName: \"kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235568 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235600 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235636 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.235761 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom\") pod \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\" (UID: \"e0dc0bac-6d04-48c3-972d-621b14a3a0d9\") " Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.237281 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.259617 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs" (OuterVolumeSpecName: "logs") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.281094 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts" (OuterVolumeSpecName: "scripts") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.321902 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq" (OuterVolumeSpecName: "kube-api-access-j88zq") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "kube-api-access-j88zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.329513 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.330871 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.330846921 podStartE2EDuration="4.330846921s" podCreationTimestamp="2025-12-04 10:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:49.271952214 +0000 UTC m=+1455.169545967" watchObservedRunningTime="2025-12-04 10:06:49.330846921 +0000 UTC m=+1455.228440674" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.348768 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.348802 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j88zq\" (UniqueName: \"kubernetes.io/projected/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-kube-api-access-j88zq\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.348813 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.348821 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.348830 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.408854 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data" (OuterVolumeSpecName: "config-data") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.427455 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0dc0bac-6d04-48c3-972d-621b14a3a0d9" (UID: "e0dc0bac-6d04-48c3-972d-621b14a3a0d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.454095 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:49 crc kubenswrapper[4693]: I1204 10:06:49.454144 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0dc0bac-6d04-48c3-972d-621b14a3a0d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.223410 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"487df7df-e43a-48a6-8350-6b9804d13e39","Type":"ContainerStarted","Data":"2ff5971277a3c3142cca95a96138425ac26eb7fbad2744ffcefd2946366a6ba4"} Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.223728 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.272443 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.284048 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.296038 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:50 crc kubenswrapper[4693]: E1204 10:06:50.296660 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api-log" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.296732 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api-log" Dec 04 10:06:50 crc kubenswrapper[4693]: E1204 10:06:50.296793 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.296865 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.297099 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.301929 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" containerName="manila-api-log" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.303157 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.317724 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.318090 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.317724 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.330740 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381491 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b6dc0f8-064a-4748-b69a-11713fe55088-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381571 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-scripts\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381609 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381695 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9k9z\" (UniqueName: \"kubernetes.io/projected/8b6dc0f8-064a-4748-b69a-11713fe55088-kube-api-access-v9k9z\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381718 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b6dc0f8-064a-4748-b69a-11713fe55088-logs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381743 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-public-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381825 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data-custom\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381851 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.381873 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484543 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data-custom\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484582 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484605 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484653 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b6dc0f8-064a-4748-b69a-11713fe55088-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484714 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-scripts\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484744 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484812 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9k9z\" (UniqueName: \"kubernetes.io/projected/8b6dc0f8-064a-4748-b69a-11713fe55088-kube-api-access-v9k9z\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484834 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b6dc0f8-064a-4748-b69a-11713fe55088-logs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.484856 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-public-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.487690 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0dc0bac-6d04-48c3-972d-621b14a3a0d9" path="/var/lib/kubelet/pods/e0dc0bac-6d04-48c3-972d-621b14a3a0d9/volumes" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.493705 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b6dc0f8-064a-4748-b69a-11713fe55088-logs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.493963 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b6dc0f8-064a-4748-b69a-11713fe55088-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.495781 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-scripts\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.496323 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.496588 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data-custom\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.510649 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-config-data\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.511093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-internal-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.511677 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b6dc0f8-064a-4748-b69a-11713fe55088-public-tls-certs\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.512129 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9k9z\" (UniqueName: \"kubernetes.io/projected/8b6dc0f8-064a-4748-b69a-11713fe55088-kube-api-access-v9k9z\") pod \"manila-api-0\" (UID: \"8b6dc0f8-064a-4748-b69a-11713fe55088\") " 
pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.646997 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.733635 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.733953 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-central-agent" containerID="cri-o://f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" gracePeriod=30 Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.734459 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="proxy-httpd" containerID="cri-o://7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" gracePeriod=30 Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.734683 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-notification-agent" containerID="cri-o://98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" gracePeriod=30 Dec 04 10:06:50 crc kubenswrapper[4693]: I1204 10:06:50.734754 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="sg-core" containerID="cri-o://83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" gracePeriod=30 Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.237580 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"487df7df-e43a-48a6-8350-6b9804d13e39","Type":"ContainerStarted","Data":"f3697bbe2c0c88a892d88ae218f9d9a90b208ed51a293d1b6fd14c51262eccee"} Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.240350 4693 generic.go:334] "Generic (PLEG): container finished" podID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerID="7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" exitCode=0 Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.240378 4693 generic.go:334] "Generic (PLEG): container finished" podID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerID="83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" exitCode=2 Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.240397 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerDied","Data":"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24"} Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.240420 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerDied","Data":"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e"} Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.263303 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.263287783 podStartE2EDuration="5.263287783s" podCreationTimestamp="2025-12-04 10:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 10:06:51.259287872 +0000 UTC m=+1457.156881625" watchObservedRunningTime="2025-12-04 10:06:51.263287783 +0000 UTC m=+1457.160881536" Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.291448 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.396464 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.396638 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.748887 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.794876 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 10:06:51 crc kubenswrapper[4693]: I1204 10:06:51.974795 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028066 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028486 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028527 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028554 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028743 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd\") pod \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.028763 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdw87\" (UniqueName: \"kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87\") pod 
\"20a8b38f-6f4c-494b-839f-dfeebb7f043e\" (UID: \"20a8b38f-6f4c-494b-839f-dfeebb7f043e\") " Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.033675 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.033828 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.052351 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87" (OuterVolumeSpecName: "kube-api-access-jdw87") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "kube-api-access-jdw87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.054988 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts" (OuterVolumeSpecName: "scripts") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.116670 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.131067 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.131116 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdw87\" (UniqueName: \"kubernetes.io/projected/20a8b38f-6f4c-494b-839f-dfeebb7f043e-kube-api-access-jdw87\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.131127 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.131136 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.131145 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20a8b38f-6f4c-494b-839f-dfeebb7f043e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.229539 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.234937 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.249451 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.264564 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b6dc0f8-064a-4748-b69a-11713fe55088","Type":"ContainerStarted","Data":"eeea087c34926f7c5b96caca58d3e547b11ea582e1cb65bf25020f2834030276"} Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.264647 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b6dc0f8-064a-4748-b69a-11713fe55088","Type":"ContainerStarted","Data":"5f71b7ac9ad288987fd81587235c0c94b8fa31d86fc766468e48a751dc9fd29f"} Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.274996 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.275061 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.300710 4693 generic.go:334] "Generic (PLEG): container finished" podID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerID="98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" exitCode=0 Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.300771 4693 generic.go:334] "Generic (PLEG): container finished" podID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerID="f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" exitCode=0 Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.301202 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.301360 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerDied","Data":"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e"} Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.301420 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerDied","Data":"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba"} Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.301433 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20a8b38f-6f4c-494b-839f-dfeebb7f043e","Type":"ContainerDied","Data":"7af16363c3be5d88312f098ba5dd456eafbd3067534f81878b21c0067380ff40"} Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.301451 4693 scope.go:117] "RemoveContainer" containerID="7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.335080 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.335347 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="dnsmasq-dns" containerID="cri-o://5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0" gracePeriod=10 Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.391340 4693 scope.go:117] "RemoveContainer" containerID="83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.407503 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data" (OuterVolumeSpecName: "config-data") pod "20a8b38f-6f4c-494b-839f-dfeebb7f043e" (UID: "20a8b38f-6f4c-494b-839f-dfeebb7f043e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.438271 4693 scope.go:117] "RemoveContainer" containerID="98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.441714 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a8b38f-6f4c-494b-839f-dfeebb7f043e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.518366 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.536672 4693 scope.go:117] "RemoveContainer" containerID="f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.635047 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.645735 4693 scope.go:117] "RemoveContainer" containerID="7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.646435 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24\": container with ID starting with 7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24 not found: ID does not exist" containerID="7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.646466 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24"} err="failed to get container status \"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24\": rpc error: code = NotFound desc = could not find container \"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24\": container with ID starting with 7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24 not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.646488 4693 scope.go:117] "RemoveContainer" containerID="83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.646753 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e\": container with ID starting with 83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e not found: ID does not exist" containerID="83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.646777 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e"} err="failed to get container status \"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e\": rpc error: code = NotFound desc = could not find container \"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e\": container with ID starting with 83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.646791 4693 scope.go:117] 
"RemoveContainer" containerID="98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.647309 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e\": container with ID starting with 98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e not found: ID does not exist" containerID="98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.647340 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e"} err="failed to get container status \"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e\": rpc error: code = NotFound desc = could not find container \"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e\": container with ID starting with 98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.647355 4693 scope.go:117] "RemoveContainer" containerID="f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.647715 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba\": container with ID starting with f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba not found: ID does not exist" containerID="f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.647736 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba"} err="failed to get container status \"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba\": rpc error: code = NotFound desc = could not find container \"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba\": container with ID starting with f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.647748 4693 scope.go:117] "RemoveContainer" containerID="7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648046 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24"} err="failed to get container status \"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24\": rpc error: code = NotFound desc = could not find container \"7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24\": container with ID starting with 7f71f61b858d85d2bd793465730914b8ad36a5e837e13e1074182ce9eff14b24 not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648062 4693 scope.go:117] "RemoveContainer" containerID="83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648392 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e"} err="failed to get container status \"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e\": rpc error: code = NotFound desc = could not find container \"83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e\": container with ID starting with 83fc2196f79f00e80275571d7befb3a3b1859d515a54e853d7ba0a8540eb715e not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648410 4693 scope.go:117] "RemoveContainer" containerID="98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648596 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e"} err="failed to get container status \"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e\": rpc error: code = NotFound desc = could not find container \"98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e\": container with ID starting with 98b4ed8e9da560f0f1abc0f68dac436bbf1c8a319104d7a5378271c9d4c02b0e not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.648621 4693 scope.go:117] "RemoveContainer" containerID="f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.649012 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba"} err="failed to get container status \"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba\": rpc error: code = NotFound desc = could not find container \"f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba\": container with ID starting with f20520c5b7e350733f2116daf84b7c70ee3d26f84ee0f089ea859846388b78ba not found: ID does not exist" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.657235 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.690391 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.690843 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="sg-core" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.690861 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="sg-core" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.690877 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-notification-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.690883 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-notification-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 10:06:52.690896 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="proxy-httpd" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.690903 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="proxy-httpd" Dec 04 10:06:52 crc kubenswrapper[4693]: E1204 
10:06:52.690915 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-central-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.690922 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-central-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.691127 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="proxy-httpd" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.691143 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-notification-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.691155 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="ceilometer-central-agent" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.691176 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" containerName="sg-core" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.692836 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.700575 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.701140 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.706645 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.761489 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.761624 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.761809 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.761851 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwcd\" (UniqueName: \"kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.761884 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.762001 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.762056 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.863970 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864365 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864407 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864445 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864468 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwcd\" (UniqueName: \"kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.864554 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.865522 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.865796 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.879102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.887862 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.888141 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.888925 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:52 crc kubenswrapper[4693]: I1204 10:06:52.908822 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwcd\" (UniqueName: \"kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd\") pod \"ceilometer-0\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " pod="openstack/ceilometer-0" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.034192 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.233491 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284212 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284401 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284596 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284688 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn9kl\" (UniqueName: \"kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.284756 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config\") pod \"dc52ec7b-620c-419c-9f73-792c1b0c638f\" (UID: \"dc52ec7b-620c-419c-9f73-792c1b0c638f\") " Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.303610 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl" (OuterVolumeSpecName: "kube-api-access-zn9kl") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "kube-api-access-zn9kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.345951 4693 generic.go:334] "Generic (PLEG): container finished" podID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerID="5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0" exitCode=0 Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.346021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" event={"ID":"dc52ec7b-620c-419c-9f73-792c1b0c638f","Type":"ContainerDied","Data":"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0"} Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.346046 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" event={"ID":"dc52ec7b-620c-419c-9f73-792c1b0c638f","Type":"ContainerDied","Data":"f10347d472f0d077082116c20242809b6d05014160203a71b653c0c4901babc1"} Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.346065 4693 scope.go:117] "RemoveContainer" containerID="5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.346182 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q7wmt" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.390561 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn9kl\" (UniqueName: \"kubernetes.io/projected/dc52ec7b-620c-419c-9f73-792c1b0c638f-kube-api-access-zn9kl\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.393806 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.418940 4693 scope.go:117] "RemoveContainer" containerID="5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.432517 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.442220 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config" (OuterVolumeSpecName: "config") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.471501 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.475530 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc52ec7b-620c-419c-9f73-792c1b0c638f" (UID: "dc52ec7b-620c-419c-9f73-792c1b0c638f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.487282 4693 scope.go:117] "RemoveContainer" containerID="5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0" Dec 04 10:06:53 crc kubenswrapper[4693]: E1204 10:06:53.488967 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0\": container with ID starting with 5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0 not found: ID does not exist" containerID="5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.489015 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0"} err="failed to get container status \"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0\": rpc error: code = NotFound desc = could not find container \"5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0\": container with ID starting with 5928fd66d610586f0a168154118d98bdfae1b512ee17ed81e204a48ddbd1f2f0 not found: ID does not exist" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.489044 4693 scope.go:117] "RemoveContainer" containerID="5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284" Dec 04 10:06:53 crc kubenswrapper[4693]: E1204 10:06:53.489582 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284\": container with ID starting with 5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284 not found: ID does not exist" containerID="5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.489606 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284"} err="failed to get container status \"5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284\": rpc error: code = NotFound desc = could not find container \"5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284\": container with ID starting with 5f5c8b4f74a2cc0d36ec7b03d3f5aaf18b4a485ca82723009acadea8eb176284 not found: ID does not exist" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.496203 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.496232 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 
10:06:53.496264 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.496275 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.496283 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc52ec7b-620c-419c-9f73-792c1b0c638f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.674935 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:53 crc kubenswrapper[4693]: W1204 10:06:53.700523 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda5ac9c1_2338_4739_ac19_e8f9fa16ac59.slice/crio-9b7160c05528bc9e7bf3ede952778c7f86ae410a6575abc953de85b010547966 WatchSource:0}: Error finding container 9b7160c05528bc9e7bf3ede952778c7f86ae410a6575abc953de85b010547966: Status 404 returned error can't find the container with id 9b7160c05528bc9e7bf3ede952778c7f86ae410a6575abc953de85b010547966 Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.712145 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:53 crc kubenswrapper[4693]: I1204 10:06:53.732832 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q7wmt"] Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.401372 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b6dc0f8-064a-4748-b69a-11713fe55088","Type":"ContainerStarted","Data":"258703f53c35a7a1d77161e83312f0bae0a65b81c9bd5a0d09f90b909a1be2a2"} Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.402989 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.407643 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerStarted","Data":"9b7160c05528bc9e7bf3ede952778c7f86ae410a6575abc953de85b010547966"} Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.442940 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.442908195 podStartE2EDuration="4.442908195s" podCreationTimestamp="2025-12-04 10:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:06:54.426972265 +0000 UTC m=+1460.324566018" watchObservedRunningTime="2025-12-04 10:06:54.442908195 +0000 UTC m=+1460.340501958" Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.503468 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a8b38f-6f4c-494b-839f-dfeebb7f043e" path="/var/lib/kubelet/pods/20a8b38f-6f4c-494b-839f-dfeebb7f043e/volumes" Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.504998 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" 
path="/var/lib/kubelet/pods/dc52ec7b-620c-419c-9f73-792c1b0c638f/volumes" Dec 04 10:06:54 crc kubenswrapper[4693]: I1204 10:06:54.617744 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:06:56 crc kubenswrapper[4693]: I1204 10:06:56.567261 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 10:06:56 crc kubenswrapper[4693]: I1204 10:06:56.644893 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 10:06:57 crc kubenswrapper[4693]: I1204 10:06:57.861578 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:07:03 crc kubenswrapper[4693]: I1204 10:07:03.779481 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 04 10:07:03 crc kubenswrapper[4693]: I1204 10:07:03.842593 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:04 crc kubenswrapper[4693]: I1204 10:07:04.559984 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="manila-scheduler" containerID="cri-o://7ed77c5fe0e00b14fc45c32c458ac5b7d9703c256ba2bb01f5ade60a906cb7c7" gracePeriod=30 Dec 04 10:07:04 crc kubenswrapper[4693]: I1204 10:07:04.560504 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="probe" containerID="cri-o://74e817bed32c958cc6ae1123f29c40f1f24cac3f4a4c62bd4357bac0a3d0f2d2" gracePeriod=30 Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.574040 4693 generic.go:334] "Generic (PLEG): container finished" podID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerID="74e817bed32c958cc6ae1123f29c40f1f24cac3f4a4c62bd4357bac0a3d0f2d2" exitCode=0 Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.574395 4693 generic.go:334] "Generic (PLEG): container finished" podID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerID="7ed77c5fe0e00b14fc45c32c458ac5b7d9703c256ba2bb01f5ade60a906cb7c7" exitCode=0 Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.574123 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerDied","Data":"74e817bed32c958cc6ae1123f29c40f1f24cac3f4a4c62bd4357bac0a3d0f2d2"} Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.574443 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerDied","Data":"7ed77c5fe0e00b14fc45c32c458ac5b7d9703c256ba2bb01f5ade60a906cb7c7"} Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.975220 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.975537 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-log" containerID="cri-o://4392f0b7049337839504d250e6ebc64937035f7589a6309cf39ca4378181567c" gracePeriod=30 Dec 04 10:07:05 crc kubenswrapper[4693]: I1204 10:07:05.975627 4693 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-httpd" containerID="cri-o://4c5e19c3b9d3c1e1b8b40d70d16b050682e30ec966883be7ebb5bf627f58c55b" gracePeriod=30 Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.390132 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516285 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516367 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516475 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516575 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516695 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ptjx\" (UniqueName: \"kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516799 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.516833 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts\") pod \"50f43b88-4cfe-4cc7-95be-d377136831e4\" (UID: \"50f43b88-4cfe-4cc7-95be-d377136831e4\") " Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.518289 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50f43b88-4cfe-4cc7-95be-d377136831e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.523870 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts" (OuterVolumeSpecName: "scripts") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.524283 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.535666 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx" (OuterVolumeSpecName: "kube-api-access-6ptjx") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "kube-api-access-6ptjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.540289 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.540693 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldd4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-ttkgx_openstack(b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.541884 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.585764 4693 generic.go:334] "Generic (PLEG): container finished" podID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerID="4392f0b7049337839504d250e6ebc64937035f7589a6309cf39ca4378181567c" exitCode=143 Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.586091 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerDied","Data":"4392f0b7049337839504d250e6ebc64937035f7589a6309cf39ca4378181567c"} Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.596535 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.601464 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.601463 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"50f43b88-4cfe-4cc7-95be-d377136831e4","Type":"ContainerDied","Data":"27315d7eaccc71d103a614b5010c5f56dc82773602872aa8ce62436a51364861"} Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.601586 4693 scope.go:117] "RemoveContainer" containerID="74e817bed32c958cc6ae1123f29c40f1f24cac3f4a4c62bd4357bac0a3d0f2d2" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.602780 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.619699 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.620008 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.620026 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.620035 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ptjx\" (UniqueName: \"kubernetes.io/projected/50f43b88-4cfe-4cc7-95be-d377136831e4-kube-api-access-6ptjx\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.653463 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data" (OuterVolumeSpecName: "config-data") pod "50f43b88-4cfe-4cc7-95be-d377136831e4" (UID: "50f43b88-4cfe-4cc7-95be-d377136831e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.716756 4693 scope.go:117] "RemoveContainer" containerID="7ed77c5fe0e00b14fc45c32c458ac5b7d9703c256ba2bb01f5ade60a906cb7c7" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.725219 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f43b88-4cfe-4cc7-95be-d377136831e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.933852 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.941880 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.963655 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.964028 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="probe" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964041 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="probe" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.964072 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="dnsmasq-dns" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964078 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="dnsmasq-dns" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.964088 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="init" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964095 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="init" Dec 04 10:07:06 crc kubenswrapper[4693]: E1204 10:07:06.964108 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="manila-scheduler" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964113 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="manila-scheduler" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964260 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc52ec7b-620c-419c-9f73-792c1b0c638f" containerName="dnsmasq-dns" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964284 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="manila-scheduler" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.964300 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" containerName="probe" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.965348 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.967420 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 10:07:06 crc kubenswrapper[4693]: I1204 10:07:06.983408 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-scripts\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132484 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132518 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132576 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f40f194-33e3-4723-817f-981394e545b9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132678 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.132743 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfm4\" (UniqueName: \"kubernetes.io/projected/5f40f194-33e3-4723-817f-981394e545b9-kube-api-access-4tfm4\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.235414 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-scripts\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.235803 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.235840 4693 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.235908 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f40f194-33e3-4723-817f-981394e545b9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.236030 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.236103 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfm4\" (UniqueName: \"kubernetes.io/projected/5f40f194-33e3-4723-817f-981394e545b9-kube-api-access-4tfm4\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.242229 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f40f194-33e3-4723-817f-981394e545b9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.247171 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.247883 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-scripts\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.248018 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.248367 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f40f194-33e3-4723-817f-981394e545b9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.261916 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfm4\" (UniqueName: \"kubernetes.io/projected/5f40f194-33e3-4723-817f-981394e545b9-kube-api-access-4tfm4\") pod \"manila-scheduler-0\" (UID: \"5f40f194-33e3-4723-817f-981394e545b9\") " pod="openstack/manila-scheduler-0" Dec 04 10:07:07 
crc kubenswrapper[4693]: I1204 10:07:07.294966 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.322101 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.322349 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-log" containerID="cri-o://f29775d6004cfb3d5fe9f692378a399855b2c4276628128cb7b5fd2a2e2d9910" gracePeriod=30 Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.323362 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-httpd" containerID="cri-o://ac3055f045b3812d37fde41974980bdadf2b3b77f8ee94f92d3619e2149fcaf0" gracePeriod=30 Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.622850 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerID="f29775d6004cfb3d5fe9f692378a399855b2c4276628128cb7b5fd2a2e2d9910" exitCode=143 Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.623099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerDied","Data":"f29775d6004cfb3d5fe9f692378a399855b2c4276628128cb7b5fd2a2e2d9910"} Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.636810 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerStarted","Data":"7de135c7ef6f1f1116dd516ca81476601668d87b6f54596580053971c69543ba"} Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.643003 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerStarted","Data":"15231f5b973c0a1d401bb6078ca3376b6b46165a7c4a1ad65db8530e4954b007"} Dec 04 10:07:07 crc kubenswrapper[4693]: I1204 10:07:07.903648 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.474145 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f43b88-4cfe-4cc7-95be-d377136831e4" path="/var/lib/kubelet/pods/50f43b88-4cfe-4cc7-95be-d377136831e4/volumes" Dec 04 10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.674620 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5f40f194-33e3-4723-817f-981394e545b9","Type":"ContainerStarted","Data":"2f55eef6bdcc2843f6afbf5e69c8fd93fe86b425b980be76815f0ff1712743be"} Dec 04 10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.675747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5f40f194-33e3-4723-817f-981394e545b9","Type":"ContainerStarted","Data":"89ca6281080e5ed68505f063672746e70755f52601362c2dcbaa124044008085"} Dec 04 10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.684039 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerStarted","Data":"3b2406d3df026daccf3ccdbc2efaaca0950bdaaf807df6861733abf3457c4237"} Dec 04 
10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.690402 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerStarted","Data":"095638f543e42621b72ed1ddc79fded1d896e3a559323a7f84022b469f9cdef9"} Dec 04 10:07:08 crc kubenswrapper[4693]: I1204 10:07:08.734687 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.234950717 podStartE2EDuration="27.734667166s" podCreationTimestamp="2025-12-04 10:06:41 +0000 UTC" firstStartedPulling="2025-12-04 10:06:43.039517092 +0000 UTC m=+1448.937110845" lastFinishedPulling="2025-12-04 10:07:06.539233551 +0000 UTC m=+1472.436827294" observedRunningTime="2025-12-04 10:07:08.701573353 +0000 UTC m=+1474.599167106" watchObservedRunningTime="2025-12-04 10:07:08.734667166 +0000 UTC m=+1474.632260919" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.701839 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerStarted","Data":"69de9d45b5b2724e606e947200cde8a18e57c19fe4f4e1f1e80e320caee74ed8"} Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.706077 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5f40f194-33e3-4723-817f-981394e545b9","Type":"ContainerStarted","Data":"212921cfcaa88ffcdfe62b73fa1ff8ea6dd606ada0cf89939bdd1615cfb6179d"} Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.709100 4693 generic.go:334] "Generic (PLEG): container finished" podID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerID="4c5e19c3b9d3c1e1b8b40d70d16b050682e30ec966883be7ebb5bf627f58c55b" exitCode=0 Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.709980 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerDied","Data":"4c5e19c3b9d3c1e1b8b40d70d16b050682e30ec966883be7ebb5bf627f58c55b"} Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.710029 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb7f08d0-230c-4f83-b559-7cd16b4629ea","Type":"ContainerDied","Data":"25e768c65269c732647a35196ac631ecb5a8c56bf94bd2ff253d434c5544ff08"} Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.710042 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e768c65269c732647a35196ac631ecb5a8c56bf94bd2ff253d434c5544ff08" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.739483 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.739456293 podStartE2EDuration="3.739456293s" podCreationTimestamp="2025-12-04 10:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:07:09.726510145 +0000 UTC m=+1475.624103908" watchObservedRunningTime="2025-12-04 10:07:09.739456293 +0000 UTC m=+1475.637050046" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.764074 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904064 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904159 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904189 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lncd\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904206 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904226 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904289 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904321 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904510 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs\") pod \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\" (UID: \"eb7f08d0-230c-4f83-b559-7cd16b4629ea\") " Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.904980 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.906159 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs" (OuterVolumeSpecName: "logs") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.911574 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.911719 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd" (OuterVolumeSpecName: "kube-api-access-6lncd") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "kube-api-access-6lncd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.911784 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph" (OuterVolumeSpecName: "ceph") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.928525 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts" (OuterVolumeSpecName: "scripts") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.957576 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.969382 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:09 crc kubenswrapper[4693]: I1204 10:07:09.992352 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data" (OuterVolumeSpecName: "config-data") pod "eb7f08d0-230c-4f83-b559-7cd16b4629ea" (UID: "eb7f08d0-230c-4f83-b559-7cd16b4629ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.013561 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.013786 4693 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.013857 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.013944 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.013999 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lncd\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-kube-api-access-6lncd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.014083 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.014167 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb7f08d0-230c-4f83-b559-7cd16b4629ea-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.014231 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb7f08d0-230c-4f83-b559-7cd16b4629ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.014290 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/eb7f08d0-230c-4f83-b559-7cd16b4629ea-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.034856 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.116348 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.543761 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:41846->10.217.0.154:9292: read: connection reset by peer" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.544020 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:41858->10.217.0.154:9292: read: connection reset by peer" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.755300 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerID="ac3055f045b3812d37fde41974980bdadf2b3b77f8ee94f92d3619e2149fcaf0" exitCode=0 Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.755476 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.756444 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerDied","Data":"ac3055f045b3812d37fde41974980bdadf2b3b77f8ee94f92d3619e2149fcaf0"} Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.909824 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.919841 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.943599 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:10 crc kubenswrapper[4693]: E1204 10:07:10.944058 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-log" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.944075 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-log" Dec 04 10:07:10 crc kubenswrapper[4693]: E1204 10:07:10.944085 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-httpd" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.944092 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-httpd" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.944271 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-log" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.944287 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" containerName="glance-httpd" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.945247 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.947580 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.954822 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:07:10 crc kubenswrapper[4693]: I1204 10:07:10.974190 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.034419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.034487 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7d6\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-kube-api-access-qx7d6\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.034600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.034787 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.034902 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.035066 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.035114 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-logs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.035158 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.035180 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145407 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145507 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145536 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-logs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145563 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145580 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145648 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7d6\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-kube-api-access-qx7d6\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145689 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.145719 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.146132 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.146781 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-logs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.147005 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56bd6fe8-e97b-4c07-a204-ee44c09401b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.165939 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.167901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.170375 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.172002 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.172917 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56bd6fe8-e97b-4c07-a204-ee44c09401b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " 
pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.199419 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7d6\" (UniqueName: \"kubernetes.io/projected/56bd6fe8-e97b-4c07-a204-ee44c09401b7-kube-api-access-qx7d6\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.258642 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"56bd6fe8-e97b-4c07-a204-ee44c09401b7\") " pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.266006 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.416029 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554094 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554177 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554241 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554302 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554391 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554409 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: 
\"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.554713 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs" (OuterVolumeSpecName: "logs") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.555243 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpzjq\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.555350 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0c2dadd4-e719-4ec8-915e-683db6276f04\" (UID: \"0c2dadd4-e719-4ec8-915e-683db6276f04\") " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.556044 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.560497 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts" (OuterVolumeSpecName: "scripts") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.561574 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.567294 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph" (OuterVolumeSpecName: "ceph") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.575513 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.577541 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq" (OuterVolumeSpecName: "kube-api-access-zpzjq") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "kube-api-access-zpzjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.640596 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.642753 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data" (OuterVolumeSpecName: "config-data") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658594 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c2dadd4-e719-4ec8-915e-683db6276f04-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658645 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658662 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658673 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658684 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpzjq\" (UniqueName: \"kubernetes.io/projected/0c2dadd4-e719-4ec8-915e-683db6276f04-kube-api-access-zpzjq\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658723 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.658735 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.672542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c2dadd4-e719-4ec8-915e-683db6276f04" (UID: "0c2dadd4-e719-4ec8-915e-683db6276f04"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.691050 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.760889 4693 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c2dadd4-e719-4ec8-915e-683db6276f04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.760934 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769490 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerStarted","Data":"4005382078fba3624e1a72f92a1f2dbf77c04eda77e8c99d486c3a4c6bd13e83"} Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769637 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-central-agent" containerID="cri-o://15231f5b973c0a1d401bb6078ca3376b6b46165a7c4a1ad65db8530e4954b007" gracePeriod=30 Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769660 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="proxy-httpd" containerID="cri-o://4005382078fba3624e1a72f92a1f2dbf77c04eda77e8c99d486c3a4c6bd13e83" gracePeriod=30 Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769666 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769783 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="sg-core" containerID="cri-o://69de9d45b5b2724e606e947200cde8a18e57c19fe4f4e1f1e80e320caee74ed8" gracePeriod=30 Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.769800 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-notification-agent" containerID="cri-o://095638f543e42621b72ed1ddc79fded1d896e3a559323a7f84022b469f9cdef9" gracePeriod=30 Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.774121 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0c2dadd4-e719-4ec8-915e-683db6276f04","Type":"ContainerDied","Data":"922ec2c7f93c103088ce3cd5934ffef4268f2ef2ff72a70109b5e4b95f7d879e"} Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.774167 4693 scope.go:117] "RemoveContainer" containerID="ac3055f045b3812d37fde41974980bdadf2b3b77f8ee94f92d3619e2149fcaf0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.774279 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.804319 4693 scope.go:117] "RemoveContainer" containerID="f29775d6004cfb3d5fe9f692378a399855b2c4276628128cb7b5fd2a2e2d9910" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.812266 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7951381299999998 podStartE2EDuration="19.81223936s" podCreationTimestamp="2025-12-04 10:06:52 +0000 UTC" firstStartedPulling="2025-12-04 10:06:53.712741401 +0000 UTC m=+1459.610335154" lastFinishedPulling="2025-12-04 10:07:10.729842621 +0000 UTC m=+1476.627436384" observedRunningTime="2025-12-04 10:07:11.791729374 +0000 UTC m=+1477.689323127" watchObservedRunningTime="2025-12-04 10:07:11.81223936 +0000 UTC m=+1477.709833113" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.829044 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.871707 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.876671 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.884583 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:11 crc kubenswrapper[4693]: E1204 10:07:11.886694 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-httpd" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.886721 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-httpd" Dec 04 10:07:11 crc kubenswrapper[4693]: E1204 10:07:11.886739 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-log" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.886748 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-log" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.887740 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-httpd" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.887774 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" containerName="glance-log" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.889095 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.891814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.895281 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.929198 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.964554 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.964885 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.964908 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.964939 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.965028 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.965092 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.965128 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgx9w\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-kube-api-access-zgx9w\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.965176 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.965200 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:11 crc kubenswrapper[4693]: I1204 10:07:11.974868 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067407 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067517 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgx9w\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-kube-api-access-zgx9w\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067551 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067571 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067609 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067637 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 
10:07:12.067652 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.067676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.068943 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.069042 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.070722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.078244 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.078285 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.082996 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.087794 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgx9w\" (UniqueName: \"kubernetes.io/projected/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-kube-api-access-zgx9w\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.088268 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.088810 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1d7ce8c-35a9-406c-9b7d-10e4976bb156-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.137580 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1d7ce8c-35a9-406c-9b7d-10e4976bb156\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.227777 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.487238 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2dadd4-e719-4ec8-915e-683db6276f04" path="/var/lib/kubelet/pods/0c2dadd4-e719-4ec8-915e-683db6276f04/volumes" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.488348 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7f08d0-230c-4f83-b559-7cd16b4629ea" path="/var/lib/kubelet/pods/eb7f08d0-230c-4f83-b559-7cd16b4629ea/volumes" Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.784747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56bd6fe8-e97b-4c07-a204-ee44c09401b7","Type":"ContainerStarted","Data":"80a8aa6d73c70dab0c4043b572ed43bbd1eec23ae7117b2271717b8c9587a9b2"} Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.788365 4693 generic.go:334] "Generic (PLEG): container finished" podID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerID="4005382078fba3624e1a72f92a1f2dbf77c04eda77e8c99d486c3a4c6bd13e83" exitCode=0 Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.788402 4693 generic.go:334] "Generic (PLEG): container finished" podID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerID="69de9d45b5b2724e606e947200cde8a18e57c19fe4f4e1f1e80e320caee74ed8" exitCode=2 Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.788421 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerDied","Data":"4005382078fba3624e1a72f92a1f2dbf77c04eda77e8c99d486c3a4c6bd13e83"} Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.788440 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerDied","Data":"69de9d45b5b2724e606e947200cde8a18e57c19fe4f4e1f1e80e320caee74ed8"} Dec 04 10:07:12 crc kubenswrapper[4693]: I1204 10:07:12.820637 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.706070 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.814457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e1d7ce8c-35a9-406c-9b7d-10e4976bb156","Type":"ContainerStarted","Data":"f6513519bc8138b9c458d15edb45ec0b914bc5141672402b664ed0e3a88936ac"} Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.814820 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1d7ce8c-35a9-406c-9b7d-10e4976bb156","Type":"ContainerStarted","Data":"d2916c754ad2d97060dc06ce466b50f8fe520bd94b090195ed3708fbff602482"} Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.826194 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56bd6fe8-e97b-4c07-a204-ee44c09401b7","Type":"ContainerStarted","Data":"5ea202f75f7fbbfbe6c861d2fda02eb703fbb078a13b434b53a72dfb402f9901"} Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.842548 4693 generic.go:334] "Generic (PLEG): container finished" podID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerID="095638f543e42621b72ed1ddc79fded1d896e3a559323a7f84022b469f9cdef9" exitCode=0 Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.842583 4693 generic.go:334] "Generic (PLEG): container finished" podID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerID="15231f5b973c0a1d401bb6078ca3376b6b46165a7c4a1ad65db8530e4954b007" exitCode=0 Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.842604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerDied","Data":"095638f543e42621b72ed1ddc79fded1d896e3a559323a7f84022b469f9cdef9"} Dec 04 10:07:13 crc kubenswrapper[4693]: I1204 10:07:13.842631 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerDied","Data":"15231f5b973c0a1d401bb6078ca3376b6b46165a7c4a1ad65db8530e4954b007"} Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.130502 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233219 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233277 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233371 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233459 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233521 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233594 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.233622 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwcd\" (UniqueName: \"kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd\") pod \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\" (UID: \"da5ac9c1-2338-4739-ac19-e8f9fa16ac59\") " Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.234867 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.235181 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.240655 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd" (OuterVolumeSpecName: "kube-api-access-zkwcd") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "kube-api-access-zkwcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.247555 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts" (OuterVolumeSpecName: "scripts") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.269221 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.325503 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336599 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336638 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336648 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336657 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwcd\" (UniqueName: \"kubernetes.io/projected/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-kube-api-access-zkwcd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336666 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.336675 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.446662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data" (OuterVolumeSpecName: "config-data") pod "da5ac9c1-2338-4739-ac19-e8f9fa16ac59" (UID: "da5ac9c1-2338-4739-ac19-e8f9fa16ac59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.546162 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5ac9c1-2338-4739-ac19-e8f9fa16ac59-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.853132 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56bd6fe8-e97b-4c07-a204-ee44c09401b7","Type":"ContainerStarted","Data":"2d2868b1ac4db21428c34c763be36a860dbefceadb3200c2dce4eca1c923daec"} Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.857430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da5ac9c1-2338-4739-ac19-e8f9fa16ac59","Type":"ContainerDied","Data":"9b7160c05528bc9e7bf3ede952778c7f86ae410a6575abc953de85b010547966"} Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.857917 4693 scope.go:117] "RemoveContainer" containerID="4005382078fba3624e1a72f92a1f2dbf77c04eda77e8c99d486c3a4c6bd13e83" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.857476 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.861027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1d7ce8c-35a9-406c-9b7d-10e4976bb156","Type":"ContainerStarted","Data":"8712672ab91f733c8b14f6999ff57bcf552dfc73c0aa8e4a31d6f4aacafc14a6"} Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.875705 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.875687244 podStartE2EDuration="4.875687244s" podCreationTimestamp="2025-12-04 10:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:07:14.870704667 +0000 UTC m=+1480.768298420" watchObservedRunningTime="2025-12-04 10:07:14.875687244 +0000 UTC m=+1480.773280997" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.888636 4693 scope.go:117] "RemoveContainer" containerID="69de9d45b5b2724e606e947200cde8a18e57c19fe4f4e1f1e80e320caee74ed8" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.902230 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.902210727 podStartE2EDuration="3.902210727s" podCreationTimestamp="2025-12-04 10:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:07:14.892085047 +0000 UTC m=+1480.789678800" watchObservedRunningTime="2025-12-04 10:07:14.902210727 +0000 UTC m=+1480.799804480" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.930396 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.932509 4693 scope.go:117] "RemoveContainer" containerID="095638f543e42621b72ed1ddc79fded1d896e3a559323a7f84022b469f9cdef9" Dec 04 10:07:14 crc kubenswrapper[4693]: I1204 10:07:14.978562 4693 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026260 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026607 4693 scope.go:117] "RemoveContainer" containerID="15231f5b973c0a1d401bb6078ca3376b6b46165a7c4a1ad65db8530e4954b007" Dec 04 10:07:15 crc kubenswrapper[4693]: E1204 10:07:15.026807 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="sg-core" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026832 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="sg-core" Dec 04 10:07:15 crc kubenswrapper[4693]: E1204 10:07:15.026863 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="proxy-httpd" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026872 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="proxy-httpd" Dec 04 10:07:15 crc kubenswrapper[4693]: E1204 10:07:15.026887 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-central-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026895 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-central-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: E1204 10:07:15.026936 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-notification-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.026945 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-notification-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.027154 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-notification-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.027169 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="ceilometer-central-agent" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.027186 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="sg-core" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.027197 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" containerName="proxy-httpd" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.029277 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.037717 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.037893 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.041779 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.162950 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163015 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163037 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163058 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163086 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163140 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.163174 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm8h7\" (UniqueName: \"kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm8h7\" (UniqueName: \"kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: 
I1204 10:07:15.264595 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264639 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264678 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264704 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.264760 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.265255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.265907 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.270843 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.270895 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.281196 4693 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.283657 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm8h7\" (UniqueName: \"kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.296485 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.359875 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.817202 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:15 crc kubenswrapper[4693]: W1204 10:07:15.829445 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b65ee0e_4039_4763_a3db_f2664e094b4d.slice/crio-707dd0e689619b2bc4a4c5410022170561539694926c183e86a4b462e248f409 WatchSource:0}: Error finding container 707dd0e689619b2bc4a4c5410022170561539694926c183e86a4b462e248f409: Status 404 returned error can't find the container with id 707dd0e689619b2bc4a4c5410022170561539694926c183e86a4b462e248f409 Dec 04 10:07:15 crc kubenswrapper[4693]: I1204 10:07:15.872050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerStarted","Data":"707dd0e689619b2bc4a4c5410022170561539694926c183e86a4b462e248f409"} Dec 04 10:07:16 crc kubenswrapper[4693]: I1204 10:07:16.301531 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:16 crc kubenswrapper[4693]: I1204 10:07:16.472557 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5ac9c1-2338-4739-ac19-e8f9fa16ac59" path="/var/lib/kubelet/pods/da5ac9c1-2338-4739-ac19-e8f9fa16ac59/volumes" Dec 04 10:07:17 crc kubenswrapper[4693]: I1204 10:07:17.296487 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.267620 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.268412 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.306703 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.333702 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.938069 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:21.938385 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.228088 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.228155 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.270257 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.272814 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.272849 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.272876 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.273575 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.273650 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c" gracePeriod=600 Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.285141 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.551656 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="487df7df-e43a-48a6-8350-6b9804d13e39" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.187:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.962389 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerStarted","Data":"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800"} Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.962950 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:22.962981 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:23.984260 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c" exitCode=0 Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:23.984405 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c"} Dec 04 10:07:23 crc kubenswrapper[4693]: I1204 10:07:23.985250 4693 scope.go:117] "RemoveContainer" containerID="17b9e6dc7a80fb27b6e0a13555809c35b6f6158654239504ab57d225574ce7bf" Dec 04 10:07:24 crc kubenswrapper[4693]: I1204 10:07:24.663065 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 04 10:07:24 crc kubenswrapper[4693]: I1204 10:07:24.749939 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:24 crc kubenswrapper[4693]: I1204 10:07:24.809491 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:07:24 crc kubenswrapper[4693]: I1204 10:07:24.809602 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:07:24 crc kubenswrapper[4693]: I1204 10:07:24.998436 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316"} Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.000888 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerStarted","Data":"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d"} Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.002805 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="manila-share" containerID="cri-o://7de135c7ef6f1f1116dd516ca81476601668d87b6f54596580053971c69543ba" gracePeriod=30 Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.002968 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" event={"ID":"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67","Type":"ContainerStarted","Data":"ad02b5a091f3913441761bde718ecf4dac9d0711387f15d0dc6554de601b3e14"} Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.003346 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="probe" containerID="cri-o://3b2406d3df026daccf3ccdbc2efaaca0950bdaaf807df6861733abf3457c4237" gracePeriod=30 Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.038255 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" podStartSLOduration=2.039840238 
podStartE2EDuration="45.038237773s" podCreationTimestamp="2025-12-04 10:06:40 +0000 UTC" firstStartedPulling="2025-12-04 10:06:40.934874684 +0000 UTC m=+1446.832468447" lastFinishedPulling="2025-12-04 10:07:23.933272229 +0000 UTC m=+1489.830865982" observedRunningTime="2025-12-04 10:07:25.034710845 +0000 UTC m=+1490.932304588" watchObservedRunningTime="2025-12-04 10:07:25.038237773 +0000 UTC m=+1490.935831526" Dec 04 10:07:25 crc kubenswrapper[4693]: I1204 10:07:25.061742 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:07:26 crc kubenswrapper[4693]: I1204 10:07:26.409062 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:26 crc kubenswrapper[4693]: I1204 10:07:26.409596 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:07:27 crc kubenswrapper[4693]: I1204 10:07:27.019922 4693 generic.go:334] "Generic (PLEG): container finished" podID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerID="3b2406d3df026daccf3ccdbc2efaaca0950bdaaf807df6861733abf3457c4237" exitCode=0 Dec 04 10:07:27 crc kubenswrapper[4693]: I1204 10:07:27.020390 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerDied","Data":"3b2406d3df026daccf3ccdbc2efaaca0950bdaaf807df6861733abf3457c4237"} Dec 04 10:07:27 crc kubenswrapper[4693]: I1204 10:07:27.323673 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:07:30 crc kubenswrapper[4693]: I1204 10:07:30.054594 4693 generic.go:334] "Generic (PLEG): container finished" podID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerID="7de135c7ef6f1f1116dd516ca81476601668d87b6f54596580053971c69543ba" exitCode=1 Dec 04 10:07:30 crc kubenswrapper[4693]: I1204 10:07:30.054677 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerDied","Data":"7de135c7ef6f1f1116dd516ca81476601668d87b6f54596580053971c69543ba"} Dec 04 10:07:31 crc kubenswrapper[4693]: I1204 10:07:31.947434 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.383645 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.545608 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546493 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546568 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546609 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546650 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.546768 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nv27\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27\") pod \"9a4f028e-3364-435d-8fef-234e98c9b6a1\" (UID: \"9a4f028e-3364-435d-8fef-234e98c9b6a1\") " Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.547090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.547450 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.548186 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.548201 4693 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9a4f028e-3364-435d-8fef-234e98c9b6a1-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.553559 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.553611 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph" (OuterVolumeSpecName: "ceph") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.553692 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts" (OuterVolumeSpecName: "scripts") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.553905 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27" (OuterVolumeSpecName: "kube-api-access-8nv27") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "kube-api-access-8nv27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.635106 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.650771 4693 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.650829 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.650847 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nv27\" (UniqueName: \"kubernetes.io/projected/9a4f028e-3364-435d-8fef-234e98c9b6a1-kube-api-access-8nv27\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.650862 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.650876 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.678716 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data" (OuterVolumeSpecName: "config-data") pod "9a4f028e-3364-435d-8fef-234e98c9b6a1" (UID: "9a4f028e-3364-435d-8fef-234e98c9b6a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:33 crc kubenswrapper[4693]: I1204 10:07:33.752829 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4f028e-3364-435d-8fef-234e98c9b6a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.095835 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9a4f028e-3364-435d-8fef-234e98c9b6a1","Type":"ContainerDied","Data":"a982cb378bff05e757241bf4760539dae613ac269980415b97544a6ae411c973"} Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.095883 4693 scope.go:117] "RemoveContainer" containerID="3b2406d3df026daccf3ccdbc2efaaca0950bdaaf807df6861733abf3457c4237" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.095893 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.102174 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerStarted","Data":"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe"} Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.122307 4693 scope.go:117] "RemoveContainer" containerID="7de135c7ef6f1f1116dd516ca81476601668d87b6f54596580053971c69543ba" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.143858 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.154792 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.171692 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:34 crc kubenswrapper[4693]: E1204 10:07:34.172169 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="probe" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.172187 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="probe" Dec 04 10:07:34 crc kubenswrapper[4693]: E1204 10:07:34.172208 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="manila-share" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.172214 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="manila-share" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.172407 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="probe" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.172424 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" containerName="manila-share" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.173514 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.175598 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.200381 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365224 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365368 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365440 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-ceph\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365668 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkq4n\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-kube-api-access-vkq4n\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365721 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.365754 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-scripts\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc 
kubenswrapper[4693]: I1204 10:07:34.467809 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkq4n\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-kube-api-access-vkq4n\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.467873 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.467896 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-scripts\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.467924 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.467968 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.467998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-ceph\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.468051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.468091 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.468271 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.473051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/33b8b8b4-d56e-4c4c-9e87-95d334534e74-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.475515 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-scripts\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.475599 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-ceph\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.477327 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.483494 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.489891 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.490550 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a4f028e-3364-435d-8fef-234e98c9b6a1" path="/var/lib/kubelet/pods/9a4f028e-3364-435d-8fef-234e98c9b6a1/volumes" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.495914 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkq4n\" (UniqueName: \"kubernetes.io/projected/33b8b8b4-d56e-4c4c-9e87-95d334534e74-kube-api-access-vkq4n\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.515963 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b8b8b4-d56e-4c4c-9e87-95d334534e74-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"33b8b8b4-d56e-4c4c-9e87-95d334534e74\") " pod="openstack/manila-share-share1-0" Dec 04 10:07:34 crc kubenswrapper[4693]: I1204 10:07:34.808510 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.142579 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerStarted","Data":"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194"} Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.142797 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-central-agent" containerID="cri-o://042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800" gracePeriod=30 Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.143057 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.143086 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="proxy-httpd" containerID="cri-o://51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194" gracePeriod=30 Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.143168 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="sg-core" containerID="cri-o://4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe" gracePeriod=30 Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.143197 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-notification-agent" containerID="cri-o://b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d" gracePeriod=30 Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.176415 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.319468799 podStartE2EDuration="21.17639798s" podCreationTimestamp="2025-12-04 10:07:14 +0000 UTC" firstStartedPulling="2025-12-04 10:07:15.831217531 +0000 UTC m=+1481.728811284" lastFinishedPulling="2025-12-04 10:07:34.688146712 +0000 UTC m=+1500.585740465" observedRunningTime="2025-12-04 10:07:35.165111458 +0000 UTC m=+1501.062705211" watchObservedRunningTime="2025-12-04 10:07:35.17639798 +0000 UTC m=+1501.073991733" Dec 04 10:07:35 crc kubenswrapper[4693]: I1204 10:07:35.397884 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:07:35 crc kubenswrapper[4693]: W1204 10:07:35.401052 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b8b8b4_d56e_4c4c_9e87_95d334534e74.slice/crio-f409d0c0e946857acd25844d8662f7d7e595f96a33808ad51bd8fde79086dab4 WatchSource:0}: Error finding container f409d0c0e946857acd25844d8662f7d7e595f96a33808ad51bd8fde79086dab4: Status 404 returned error can't find the container with id f409d0c0e946857acd25844d8662f7d7e595f96a33808ad51bd8fde79086dab4 Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.168908 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"33b8b8b4-d56e-4c4c-9e87-95d334534e74","Type":"ContainerStarted","Data":"dd57cae9867e717d4de263b4a350a2e219e7e8debad21257880b49fc79653599"} Dec 04 10:07:36 
crc kubenswrapper[4693]: I1204 10:07:36.169646 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"33b8b8b4-d56e-4c4c-9e87-95d334534e74","Type":"ContainerStarted","Data":"f409d0c0e946857acd25844d8662f7d7e595f96a33808ad51bd8fde79086dab4"} Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177452 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerID="51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194" exitCode=0 Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177501 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerID="4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe" exitCode=2 Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177510 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerID="b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d" exitCode=0 Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177532 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerDied","Data":"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194"} Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177560 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerDied","Data":"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe"} Dec 04 10:07:36 crc kubenswrapper[4693]: I1204 10:07:36.177572 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerDied","Data":"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d"} Dec 04 10:07:37 crc kubenswrapper[4693]: I1204 10:07:37.193154 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"33b8b8b4-d56e-4c4c-9e87-95d334534e74","Type":"ContainerStarted","Data":"f604a537a1a908fb8b439650f533df0a845b06ffc86b5d26c3c9dcd4334a782f"} Dec 04 10:07:37 crc kubenswrapper[4693]: I1204 10:07:37.237119 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.237032077 podStartE2EDuration="3.237032077s" podCreationTimestamp="2025-12-04 10:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:07:37.226307911 +0000 UTC m=+1503.123901664" watchObservedRunningTime="2025-12-04 10:07:37.237032077 +0000 UTC m=+1503.134625830" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.014382 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.130560 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.130709 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.130855 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm8h7\" (UniqueName: \"kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.130943 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.131107 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.131142 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.131408 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle\") pod \"6b65ee0e-4039-4763-a3db-f2664e094b4d\" (UID: \"6b65ee0e-4039-4763-a3db-f2664e094b4d\") " Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.131598 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.132261 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.133223 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.152012 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7" (OuterVolumeSpecName: "kube-api-access-bm8h7") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "kube-api-access-bm8h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.161101 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts" (OuterVolumeSpecName: "scripts") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.187636 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.228984 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerID="042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800" exitCode=0 Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.229082 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerDied","Data":"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800"} Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.229104 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.229139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b65ee0e-4039-4763-a3db-f2664e094b4d","Type":"ContainerDied","Data":"707dd0e689619b2bc4a4c5410022170561539694926c183e86a4b462e248f409"} Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.229175 4693 scope.go:117] "RemoveContainer" containerID="51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.234248 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.234283 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm8h7\" (UniqueName: \"kubernetes.io/projected/6b65ee0e-4039-4763-a3db-f2664e094b4d-kube-api-access-bm8h7\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.234301 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.234311 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b65ee0e-4039-4763-a3db-f2664e094b4d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.276342 4693 scope.go:117] "RemoveContainer" containerID="4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.302570 4693 scope.go:117] "RemoveContainer" containerID="b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.336085 4693 scope.go:117] "RemoveContainer" containerID="042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.359791 4693 scope.go:117] "RemoveContainer" containerID="51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194" Dec 04 10:07:40 crc kubenswrapper[4693]: E1204 10:07:40.361466 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194\": container with ID starting with 51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194 not found: ID does not exist" containerID="51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.361503 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194"} err="failed to get container status \"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194\": rpc error: code = NotFound desc = could not find container \"51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194\": container with ID starting with 51c6abb0edde54f09d54781e8cc95fa10de4fcd2700ec028e17016d4def37194 not found: ID does not exist" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.361531 4693 scope.go:117] "RemoveContainer" containerID="4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe" Dec 04 
10:07:40 crc kubenswrapper[4693]: E1204 10:07:40.361961 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe\": container with ID starting with 4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe not found: ID does not exist" containerID="4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.361988 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe"} err="failed to get container status \"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe\": rpc error: code = NotFound desc = could not find container \"4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe\": container with ID starting with 4efec787434d20c4ff6d94c1f40723fb60b47025565bd30cbc856491bf9bbabe not found: ID does not exist" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.362003 4693 scope.go:117] "RemoveContainer" containerID="b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d" Dec 04 10:07:40 crc kubenswrapper[4693]: E1204 10:07:40.362459 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d\": container with ID starting with b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d not found: ID does not exist" containerID="b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.362484 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d"} err="failed to get container status \"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d\": rpc error: code = NotFound desc = could not find container \"b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d\": container with ID starting with b5f390215541f6ccb8615fe562362b6d55b44350b704d4bcb58afde0aafa673d not found: ID does not exist" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.362501 4693 scope.go:117] "RemoveContainer" containerID="042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800" Dec 04 10:07:40 crc kubenswrapper[4693]: E1204 10:07:40.362773 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800\": container with ID starting with 042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800 not found: ID does not exist" containerID="042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800" Dec 04 10:07:40 crc kubenswrapper[4693]: I1204 10:07:40.362797 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800"} err="failed to get container status \"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800\": rpc error: code = NotFound desc = could not find container \"042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800\": container with ID starting with 042183c26b92a77aba1fc501f43fa15e0b052528d37a4ecd72bea92ed9df1800 not found: ID does not exist" Dec 04 10:07:41 crc 
kubenswrapper[4693]: I1204 10:07:41.007776 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.054751 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.065754 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data" (OuterVolumeSpecName: "config-data") pod "6b65ee0e-4039-4763-a3db-f2664e094b4d" (UID: "6b65ee0e-4039-4763-a3db-f2664e094b4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.157500 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b65ee0e-4039-4763-a3db-f2664e094b4d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.192779 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.204638 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.240158 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:41 crc kubenswrapper[4693]: E1204 10:07:41.240787 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="sg-core" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.240810 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="sg-core" Dec 04 10:07:41 crc kubenswrapper[4693]: E1204 10:07:41.240842 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="proxy-httpd" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.240849 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="proxy-httpd" Dec 04 10:07:41 crc kubenswrapper[4693]: E1204 10:07:41.240861 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-central-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.240867 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-central-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: E1204 10:07:41.240877 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-notification-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.240883 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-notification-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.241098 4693 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-notification-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.241118 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="sg-core" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.241133 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="proxy-httpd" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.241147 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" containerName="ceilometer-central-agent" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.245698 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.249059 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.253001 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.264725 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363244 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363345 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363494 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363542 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " 
pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.363592 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvdj\" (UniqueName: \"kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.466625 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.466829 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.467067 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.467295 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvdj\" (UniqueName: \"kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.467403 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.467595 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.467656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.468169 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.468248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " 
pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.476217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.480077 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.481190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.481790 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.495814 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvdj\" (UniqueName: \"kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj\") pod \"ceilometer-0\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " pod="openstack/ceilometer-0" Dec 04 10:07:41 crc kubenswrapper[4693]: I1204 10:07:41.633787 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:42 crc kubenswrapper[4693]: I1204 10:07:42.133011 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:42 crc kubenswrapper[4693]: W1204 10:07:42.155664 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cd9bb4f_6b8f_4a3f_bc5a_a82cc88e8248.slice/crio-4d9ab12cab265aa7922b58c0c142c4761077082230435a2f5076519483319eb9 WatchSource:0}: Error finding container 4d9ab12cab265aa7922b58c0c142c4761077082230435a2f5076519483319eb9: Status 404 returned error can't find the container with id 4d9ab12cab265aa7922b58c0c142c4761077082230435a2f5076519483319eb9 Dec 04 10:07:42 crc kubenswrapper[4693]: I1204 10:07:42.274722 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerStarted","Data":"4d9ab12cab265aa7922b58c0c142c4761077082230435a2f5076519483319eb9"} Dec 04 10:07:42 crc kubenswrapper[4693]: I1204 10:07:42.476182 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b65ee0e-4039-4763-a3db-f2664e094b4d" path="/var/lib/kubelet/pods/6b65ee0e-4039-4763-a3db-f2664e094b4d/volumes" Dec 04 10:07:43 crc kubenswrapper[4693]: I1204 10:07:43.287977 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerStarted","Data":"3ccbaf0764a168a9e71dd4d812e12e6c058a93386431a786218e021d7c672146"} Dec 04 10:07:44 crc kubenswrapper[4693]: I1204 10:07:44.299181 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerStarted","Data":"c7601927361f0bbfa04e814d3219bf8c4c5f1e8f0ae8a6374893a97440634bd6"} Dec 04 10:07:44 crc kubenswrapper[4693]: I1204 10:07:44.809129 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 04 10:07:45 crc kubenswrapper[4693]: I1204 10:07:45.314930 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerStarted","Data":"18db01b508dfa078bf03a51781b65f90b8bfd2b9074bc5562fbb8fbf7d37b67b"} Dec 04 10:07:46 crc kubenswrapper[4693]: I1204 10:07:46.327647 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerStarted","Data":"6459ee2ef2d5d3717a9c5eb8672cf6b91f5d6a440b1460b4495184431e6894af"} Dec 04 10:07:46 crc kubenswrapper[4693]: I1204 10:07:46.327907 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:07:46 crc kubenswrapper[4693]: I1204 10:07:46.329719 4693 generic.go:334] "Generic (PLEG): container finished" podID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" containerID="ad02b5a091f3913441761bde718ecf4dac9d0711387f15d0dc6554de601b3e14" exitCode=0 Dec 04 10:07:46 crc kubenswrapper[4693]: I1204 10:07:46.329760 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" event={"ID":"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67","Type":"ContainerDied","Data":"ad02b5a091f3913441761bde718ecf4dac9d0711387f15d0dc6554de601b3e14"} Dec 04 10:07:46 crc kubenswrapper[4693]: I1204 10:07:46.355923 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.05556525 podStartE2EDuration="5.355903664s" podCreationTimestamp="2025-12-04 10:07:41 +0000 UTC" firstStartedPulling="2025-12-04 10:07:42.1596627 +0000 UTC m=+1508.057256463" lastFinishedPulling="2025-12-04 10:07:45.460001104 +0000 UTC m=+1511.357594877" observedRunningTime="2025-12-04 10:07:46.348045197 +0000 UTC m=+1512.245638960" watchObservedRunningTime="2025-12-04 10:07:46.355903664 +0000 UTC m=+1512.253497417" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.705249 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.711394 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data\") pod \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.711616 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle\") pod \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.711656 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldd4s\" (UniqueName: \"kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s\") pod \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.711728 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts\") pod \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\" (UID: \"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67\") " Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.719613 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts" (OuterVolumeSpecName: "scripts") pod "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" (UID: "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.721507 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s" (OuterVolumeSpecName: "kube-api-access-ldd4s") pod "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" (UID: "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67"). InnerVolumeSpecName "kube-api-access-ldd4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.757348 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" (UID: "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.784510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data" (OuterVolumeSpecName: "config-data") pod "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" (UID: "b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.814737 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.815080 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldd4s\" (UniqueName: \"kubernetes.io/projected/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-kube-api-access-ldd4s\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.815177 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:47 crc kubenswrapper[4693]: I1204 10:07:47.815250 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.349169 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" event={"ID":"b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67","Type":"ContainerDied","Data":"28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127"} Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.349217 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28219fe95efd9efcdea703e667f9d48b949443710076bedddac8c49088751127" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.349274 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ttkgx" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.484660 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:07:48 crc kubenswrapper[4693]: E1204 10:07:48.484971 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" containerName="nova-cell0-conductor-db-sync" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.484985 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" containerName="nova-cell0-conductor-db-sync" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.485197 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" containerName="nova-cell0-conductor-db-sync" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.485887 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.488350 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.488400 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-r828x" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.494914 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.532249 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.532401 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.532468 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7zp\" (UniqueName: \"kubernetes.io/projected/2ea5071f-1037-494c-b12f-ebddb5deb122-kube-api-access-hr7zp\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.634399 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.634695 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.634763 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7zp\" (UniqueName: \"kubernetes.io/projected/2ea5071f-1037-494c-b12f-ebddb5deb122-kube-api-access-hr7zp\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.638354 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.639022 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea5071f-1037-494c-b12f-ebddb5deb122-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.657038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7zp\" (UniqueName: \"kubernetes.io/projected/2ea5071f-1037-494c-b12f-ebddb5deb122-kube-api-access-hr7zp\") pod \"nova-cell0-conductor-0\" (UID: \"2ea5071f-1037-494c-b12f-ebddb5deb122\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.809127 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.818275 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.818599 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-central-agent" containerID="cri-o://3ccbaf0764a168a9e71dd4d812e12e6c058a93386431a786218e021d7c672146" gracePeriod=30 Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.818643 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="sg-core" containerID="cri-o://18db01b508dfa078bf03a51781b65f90b8bfd2b9074bc5562fbb8fbf7d37b67b" gracePeriod=30 Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.818781 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-notification-agent" containerID="cri-o://c7601927361f0bbfa04e814d3219bf8c4c5f1e8f0ae8a6374893a97440634bd6" gracePeriod=30 Dec 04 10:07:48 crc kubenswrapper[4693]: I1204 10:07:48.819183 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="proxy-httpd" containerID="cri-o://6459ee2ef2d5d3717a9c5eb8672cf6b91f5d6a440b1460b4495184431e6894af" gracePeriod=30 Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.304213 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.359235 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2ea5071f-1037-494c-b12f-ebddb5deb122","Type":"ContainerStarted","Data":"b051d8cefad1fbb4ddf5a8af34df0ed86c4efcb6d5f0a88c3cf2de9068469a50"} Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363117 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerID="6459ee2ef2d5d3717a9c5eb8672cf6b91f5d6a440b1460b4495184431e6894af" exitCode=0 Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363162 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerID="18db01b508dfa078bf03a51781b65f90b8bfd2b9074bc5562fbb8fbf7d37b67b" exitCode=2 Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363176 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerID="c7601927361f0bbfa04e814d3219bf8c4c5f1e8f0ae8a6374893a97440634bd6" exitCode=0 Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerDied","Data":"6459ee2ef2d5d3717a9c5eb8672cf6b91f5d6a440b1460b4495184431e6894af"} Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363283 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerDied","Data":"18db01b508dfa078bf03a51781b65f90b8bfd2b9074bc5562fbb8fbf7d37b67b"} Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.363296 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerDied","Data":"c7601927361f0bbfa04e814d3219bf8c4c5f1e8f0ae8a6374893a97440634bd6"} Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.831162 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.834166 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.850149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.865583 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6rp\" (UniqueName: \"kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.865725 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.865809 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.967777 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.967871 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.968010 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6rp\" (UniqueName: 
\"kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.968273 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.968377 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:49 crc kubenswrapper[4693]: I1204 10:07:49.984737 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6rp\" (UniqueName: \"kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp\") pod \"certified-operators-lgh7t\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:50 crc kubenswrapper[4693]: I1204 10:07:50.174378 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:07:50 crc kubenswrapper[4693]: I1204 10:07:50.382573 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2ea5071f-1037-494c-b12f-ebddb5deb122","Type":"ContainerStarted","Data":"78458fd933329923dd1dac588efa1ed39b588f5d1044451c25376abca1e349ec"} Dec 04 10:07:50 crc kubenswrapper[4693]: I1204 10:07:50.383016 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:50 crc kubenswrapper[4693]: I1204 10:07:50.412799 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.412757678 podStartE2EDuration="2.412757678s" podCreationTimestamp="2025-12-04 10:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:07:50.404266264 +0000 UTC m=+1516.301860017" watchObservedRunningTime="2025-12-04 10:07:50.412757678 +0000 UTC m=+1516.310351431" Dec 04 10:07:50 crc kubenswrapper[4693]: I1204 10:07:50.718827 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:07:51 crc kubenswrapper[4693]: I1204 10:07:51.398597 4693 generic.go:334] "Generic (PLEG): container finished" podID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerID="c77827b8a0b46bcd56cd8779c41f091e6b4c1e082180f0c3a5e2451dea2c590b" exitCode=0 Dec 04 10:07:51 crc kubenswrapper[4693]: I1204 10:07:51.398760 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerDied","Data":"c77827b8a0b46bcd56cd8779c41f091e6b4c1e082180f0c3a5e2451dea2c590b"} Dec 04 10:07:51 crc kubenswrapper[4693]: I1204 10:07:51.400948 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" 
event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerStarted","Data":"04cc58bb8b46114b171085a0879296e16630d98db989a7d5e1f987b18c7901b3"} Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.411979 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerID="3ccbaf0764a168a9e71dd4d812e12e6c058a93386431a786218e021d7c672146" exitCode=0 Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.412268 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerDied","Data":"3ccbaf0764a168a9e71dd4d812e12e6c058a93386431a786218e021d7c672146"} Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.550658 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.712718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.712794 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.712855 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713016 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvvdj\" (UniqueName: \"kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713041 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713181 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713361 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts\") pod \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\" (UID: \"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248\") " Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713925 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.713963 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.721353 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts" (OuterVolumeSpecName: "scripts") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.724716 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj" (OuterVolumeSpecName: "kube-api-access-gvvdj") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "kube-api-access-gvvdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.746974 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.815564 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.815803 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.815885 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvvdj\" (UniqueName: \"kubernetes.io/projected/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-kube-api-access-gvvdj\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.815991 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.816071 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.822286 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data" (OuterVolumeSpecName: "config-data") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.824606 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" (UID: "4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.919300 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:52 crc kubenswrapper[4693]: I1204 10:07:52.919856 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.426696 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248","Type":"ContainerDied","Data":"4d9ab12cab265aa7922b58c0c142c4761077082230435a2f5076519483319eb9"} Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.426769 4693 scope.go:117] "RemoveContainer" containerID="6459ee2ef2d5d3717a9c5eb8672cf6b91f5d6a440b1460b4495184431e6894af" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.428278 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.465200 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.468170 4693 scope.go:117] "RemoveContainer" containerID="18db01b508dfa078bf03a51781b65f90b8bfd2b9074bc5562fbb8fbf7d37b67b" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.487805 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.496755 4693 scope.go:117] "RemoveContainer" containerID="c7601927361f0bbfa04e814d3219bf8c4c5f1e8f0ae8a6374893a97440634bd6" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.503029 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:53 crc kubenswrapper[4693]: E1204 10:07:53.503718 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="sg-core" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.503747 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="sg-core" Dec 04 10:07:53 crc kubenswrapper[4693]: E1204 10:07:53.503789 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-central-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.503803 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-central-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: E1204 10:07:53.503821 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="proxy-httpd" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.503829 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="proxy-httpd" Dec 04 10:07:53 crc kubenswrapper[4693]: E1204 10:07:53.503859 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-notification-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.503869 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-notification-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.504126 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-central-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.504164 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="sg-core" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.504185 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="ceilometer-notification-agent" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.504198 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" containerName="proxy-httpd" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.506544 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.509962 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.519610 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.576409 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.585119 4693 scope.go:117] "RemoveContainer" containerID="3ccbaf0764a168a9e71dd4d812e12e6c058a93386431a786218e021d7c672146" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648077 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648149 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648213 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648255 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vstf\" (UniqueName: \"kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648301 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.648355 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749682 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749778 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749862 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749920 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749962 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vstf\" (UniqueName: \"kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.749985 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.750043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.750847 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.753753 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.757260 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.760759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.760925 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.761405 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.773316 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vstf\" (UniqueName: \"kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf\") pod \"ceilometer-0\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " pod="openstack/ceilometer-0" Dec 04 10:07:53 crc kubenswrapper[4693]: I1204 10:07:53.846020 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.209536 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.212622 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.252543 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.321930 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.365992 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8wc\" (UniqueName: \"kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.366502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.366742 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.441880 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerStarted","Data":"84f0a047e2f91d2135153974ad4f9a451d61656437a01f1bf38987c396a53402"} Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.445166 4693 generic.go:334] "Generic (PLEG): container finished" podID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerID="3c48b8eca371e3ab44cd5cec654814ab8f3d5830db675b80b7637a7bf1333c2c" exitCode=0 Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.445207 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerDied","Data":"3c48b8eca371e3ab44cd5cec654814ab8f3d5830db675b80b7637a7bf1333c2c"} Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.469145 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8wc\" (UniqueName: \"kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.469297 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.469380 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.470037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.470511 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.487817 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248" path="/var/lib/kubelet/pods/4cd9bb4f-6b8f-4a3f-bc5a-a82cc88e8248/volumes" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.504413 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8wc\" (UniqueName: \"kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc\") pod \"redhat-marketplace-sxjb6\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:54 crc kubenswrapper[4693]: I1204 10:07:54.549703 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:07:55 crc kubenswrapper[4693]: W1204 10:07:55.053512 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd364c298_a01c_42a4_bf43_83dfc48b54a7.slice/crio-8f16f43a18a54ebe99069ba7059cddd88de91d995d17b2d66e2d5d399b60157b WatchSource:0}: Error finding container 8f16f43a18a54ebe99069ba7059cddd88de91d995d17b2d66e2d5d399b60157b: Status 404 returned error can't find the container with id 8f16f43a18a54ebe99069ba7059cddd88de91d995d17b2d66e2d5d399b60157b Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.053808 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.465905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerStarted","Data":"e724374d1c0d84d711d483bb8c53264ac4d7c417fb9c296e18b6e84cb424751f"} Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.471555 4693 generic.go:334] "Generic (PLEG): container finished" podID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerID="c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d" exitCode=0 Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.471670 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerDied","Data":"c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d"} Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.471743 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerStarted","Data":"8f16f43a18a54ebe99069ba7059cddd88de91d995d17b2d66e2d5d399b60157b"} Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.494912 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerStarted","Data":"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc"} Dec 04 10:07:55 crc kubenswrapper[4693]: I1204 10:07:55.497988 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lgh7t" podStartSLOduration=3.025868152 podStartE2EDuration="6.497964252s" podCreationTimestamp="2025-12-04 10:07:49 +0000 UTC" firstStartedPulling="2025-12-04 10:07:51.403498479 +0000 UTC m=+1517.301092232" lastFinishedPulling="2025-12-04 10:07:54.875594579 +0000 UTC m=+1520.773188332" observedRunningTime="2025-12-04 10:07:55.487161923 +0000 UTC m=+1521.384755686" watchObservedRunningTime="2025-12-04 10:07:55.497964252 +0000 UTC m=+1521.395558005" Dec 04 10:07:56 crc kubenswrapper[4693]: I1204 10:07:56.510218 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerStarted","Data":"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468"} Dec 04 10:07:56 crc kubenswrapper[4693]: I1204 10:07:56.511062 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerStarted","Data":"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59"} Dec 04 
10:07:56 crc kubenswrapper[4693]: I1204 10:07:56.512698 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerStarted","Data":"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde"} Dec 04 10:07:56 crc kubenswrapper[4693]: I1204 10:07:56.797302 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 04 10:07:57 crc kubenswrapper[4693]: I1204 10:07:57.530109 4693 generic.go:334] "Generic (PLEG): container finished" podID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerID="7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde" exitCode=0 Dec 04 10:07:57 crc kubenswrapper[4693]: I1204 10:07:57.530500 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerDied","Data":"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde"} Dec 04 10:07:58 crc kubenswrapper[4693]: I1204 10:07:58.851920 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 10:07:59 crc kubenswrapper[4693]: I1204 10:07:59.877392 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xfxvd"] Dec 04 10:07:59 crc kubenswrapper[4693]: I1204 10:07:59.879171 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:07:59 crc kubenswrapper[4693]: I1204 10:07:59.882512 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 10:07:59 crc kubenswrapper[4693]: I1204 10:07:59.884183 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 10:07:59 crc kubenswrapper[4693]: I1204 10:07:59.907425 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xfxvd"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.024703 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.025397 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.025597 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.025634 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxqg\" (UniqueName: 
\"kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.131962 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.133318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.133763 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.133835 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxqg\" (UniqueName: \"kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.161971 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.175424 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.178151 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.185722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxqg\" (UniqueName: \"kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg\") pod \"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.185903 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.245376 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data\") pod 
\"nova-cell0-cell-mapping-xfxvd\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.255465 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.262616 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.291520 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.356340 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.373814 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.376198 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.376655 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.394420 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.395256 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.396433 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.396541 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.408206 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.408363 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.425779 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnxx\" (UniqueName: \"kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.425854 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.425922 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.425944 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.436953 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.455291 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.455486 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.473214 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.527763 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.527841 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc89b\" (UniqueName: \"kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.527906 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnxx\" (UniqueName: \"kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.527955 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.527992 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6kvm\" (UniqueName: \"kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528031 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528110 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528202 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528229 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528299 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528319 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.528365 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf8pb\" (UniqueName: \"kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.529242 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.529865 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.540851 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.540866 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.549465 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.556012 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.571863 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.595978 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnxx\" (UniqueName: \"kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx\") pod \"nova-api-0\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.631035 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerStarted","Data":"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f"} Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.631568 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.633711 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.633768 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.633859 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.633905 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.633937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs\") pod \"nova-metadata-0\" (UID: 
\"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634002 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634020 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf8pb\" (UniqueName: \"kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634075 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwws\" (UniqueName: \"kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634112 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634162 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634185 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc89b\" (UniqueName: \"kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634234 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634303 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: 
\"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634322 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.634376 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6kvm\" (UniqueName: \"kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.642159 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.642553 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.645601 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.654355 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.660110 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.665465 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2970055609999998 podStartE2EDuration="7.665439799s" podCreationTimestamp="2025-12-04 10:07:53 +0000 UTC" firstStartedPulling="2025-12-04 10:07:54.33412546 +0000 UTC m=+1520.231719213" lastFinishedPulling="2025-12-04 10:07:59.702559698 +0000 UTC m=+1525.600153451" observedRunningTime="2025-12-04 10:08:00.660504652 +0000 UTC m=+1526.558098415" watchObservedRunningTime="2025-12-04 10:08:00.665439799 +0000 UTC m=+1526.563033552" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.676107 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.669441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc89b\" (UniqueName: \"kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.666144 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data\") pod \"nova-scheduler-0\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.685795 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf8pb\" (UniqueName: \"kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.689074 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6kvm\" (UniqueName: \"kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm\") pod \"nova-metadata-0\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.739665 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.739794 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.739968 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwws\" (UniqueName: \"kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.740001 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.740059 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.740102 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.741261 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.741972 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.743612 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.744447 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.744586 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.788079 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.788312 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwws\" (UniqueName: \"kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws\") pod \"dnsmasq-dns-858594bc89-xx8tz\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.803414 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.830828 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.854559 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.859905 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:00 crc kubenswrapper[4693]: I1204 10:08:00.881127 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.265412 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xfxvd"] Dec 04 10:08:01 crc kubenswrapper[4693]: W1204 10:08:01.326089 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod713c9761_3dbd_4889_9678_9acee2bd6635.slice/crio-ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61 WatchSource:0}: Error finding container ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61: Status 404 returned error can't find the container with id ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61 Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.399598 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2n7g"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.401232 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.404953 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.405591 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.412612 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.430984 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2n7g"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.478149 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fwv\" (UniqueName: \"kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.478235 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.478399 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.478508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts\") pod 
\"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.581793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.582385 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.583359 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fwv\" (UniqueName: \"kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.583430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.595292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.621875 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.623482 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.625130 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.629634 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fwv\" (UniqueName: \"kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv\") pod \"nova-cell1-conductor-db-sync-x2n7g\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: W1204 10:08:01.632297 4693 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c95629b_b509_4fe1_9e8c_e4e51c58aab6.slice/crio-8787be2ec459dc0c5ad70e17cd778953dc28ae3eaa8983187f51db83a0942a89 WatchSource:0}: Error finding container 8787be2ec459dc0c5ad70e17cd778953dc28ae3eaa8983187f51db83a0942a89: Status 404 returned error can't find the container with id 8787be2ec459dc0c5ad70e17cd778953dc28ae3eaa8983187f51db83a0942a89 Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.660954 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.688220 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerStarted","Data":"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3"} Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.691060 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xfxvd" event={"ID":"713c9761-3dbd-4889-9678-9acee2bd6635","Type":"ContainerStarted","Data":"ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61"} Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.700468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerStarted","Data":"8787be2ec459dc0c5ad70e17cd778953dc28ae3eaa8983187f51db83a0942a89"} Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.722511 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxjb6" podStartSLOduration=3.219218873 podStartE2EDuration="7.722492551s" podCreationTimestamp="2025-12-04 10:07:54 +0000 UTC" firstStartedPulling="2025-12-04 10:07:55.482619618 +0000 UTC m=+1521.380213371" lastFinishedPulling="2025-12-04 10:07:59.985893296 +0000 UTC m=+1525.883487049" observedRunningTime="2025-12-04 10:08:01.712696381 +0000 UTC m=+1527.610290134" watchObservedRunningTime="2025-12-04 10:08:01.722492551 +0000 UTC m=+1527.620086304" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.754569 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.836682 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:01 crc kubenswrapper[4693]: I1204 10:08:01.943615 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.360665 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2n7g"] Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.584572 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.708030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7821c94d-7490-46ce-904f-ffc0e90b9b8b","Type":"ContainerStarted","Data":"398f2e501c18887bf17717f26540922f164c414976a14277a73dc88aa85b1894"} Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.715159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d55cc9d0-bc14-45af-978e-d72660ebcda0","Type":"ContainerStarted","Data":"b7d8efbddb1d963f3ae560af0d76724a8c8a0941b7a8bf6b9d3ec7f7ffac4127"} Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.719669 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" event={"ID":"5e81cf43-7573-4691-878b-eeae474d75be","Type":"ContainerStarted","Data":"6dcfeccf963f436bc75a219e663bc7ab46d6307f1e58f4f8b9bd5d197ffe4025"} Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.722100 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerStarted","Data":"d7b199d2ede5a4820e44ec24141bdcec02c2d8c4b9d644585d182c83efd50459"} Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.729256 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" event={"ID":"22bd0022-0cff-49e8-96c0-ef6334718552","Type":"ContainerStarted","Data":"20d8d6b0328fee94b791f01a150ef52396dd44b8c5ba893b00152728c8acb166"} Dec 04 10:08:02 crc kubenswrapper[4693]: I1204 10:08:02.729467 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lgh7t" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="registry-server" containerID="cri-o://e724374d1c0d84d711d483bb8c53264ac4d7c417fb9c296e18b6e84cb424751f" gracePeriod=2 Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.793263 4693 generic.go:334] "Generic (PLEG): container finished" podID="22bd0022-0cff-49e8-96c0-ef6334718552" containerID="ba46fc675691f9114426b5cf58b98e7ff1de034a08c56cea4e5c1f5de6910c3e" exitCode=0 Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.794362 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" event={"ID":"22bd0022-0cff-49e8-96c0-ef6334718552","Type":"ContainerDied","Data":"ba46fc675691f9114426b5cf58b98e7ff1de034a08c56cea4e5c1f5de6910c3e"} Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.799112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xfxvd" event={"ID":"713c9761-3dbd-4889-9678-9acee2bd6635","Type":"ContainerStarted","Data":"8fcbc0ab777db633d734638b6dfc6f50f372f62d7296cb8fd8788f20b86b027f"} Dec 04 10:08:03 crc 
kubenswrapper[4693]: I1204 10:08:03.818730 4693 generic.go:334] "Generic (PLEG): container finished" podID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerID="e724374d1c0d84d711d483bb8c53264ac4d7c417fb9c296e18b6e84cb424751f" exitCode=0 Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.818853 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerDied","Data":"e724374d1c0d84d711d483bb8c53264ac4d7c417fb9c296e18b6e84cb424751f"} Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.821110 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" event={"ID":"5e81cf43-7573-4691-878b-eeae474d75be","Type":"ContainerStarted","Data":"a8bbc45cb758b8cc0c6d98a2d90f19d61fd7af8daf5de08f295908f76256a884"} Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.847542 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xfxvd" podStartSLOduration=4.847516797 podStartE2EDuration="4.847516797s" podCreationTimestamp="2025-12-04 10:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:03.83676673 +0000 UTC m=+1529.734360483" watchObservedRunningTime="2025-12-04 10:08:03.847516797 +0000 UTC m=+1529.745110550" Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.861975 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" podStartSLOduration=2.861955416 podStartE2EDuration="2.861955416s" podCreationTimestamp="2025-12-04 10:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:03.859852857 +0000 UTC m=+1529.757446610" watchObservedRunningTime="2025-12-04 10:08:03.861955416 +0000 UTC m=+1529.759549169" Dec 04 10:08:03 crc kubenswrapper[4693]: I1204 10:08:03.996195 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.071848 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content\") pod \"fadb85a0-e535-4a14-ba59-e76d370713b1\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.072020 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn6rp\" (UniqueName: \"kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp\") pod \"fadb85a0-e535-4a14-ba59-e76d370713b1\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.072048 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities\") pod \"fadb85a0-e535-4a14-ba59-e76d370713b1\" (UID: \"fadb85a0-e535-4a14-ba59-e76d370713b1\") " Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.073538 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities" (OuterVolumeSpecName: "utilities") pod "fadb85a0-e535-4a14-ba59-e76d370713b1" (UID: "fadb85a0-e535-4a14-ba59-e76d370713b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.105109 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp" (OuterVolumeSpecName: "kube-api-access-gn6rp") pod "fadb85a0-e535-4a14-ba59-e76d370713b1" (UID: "fadb85a0-e535-4a14-ba59-e76d370713b1"). InnerVolumeSpecName "kube-api-access-gn6rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.136079 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fadb85a0-e535-4a14-ba59-e76d370713b1" (UID: "fadb85a0-e535-4a14-ba59-e76d370713b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.178190 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.178612 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn6rp\" (UniqueName: \"kubernetes.io/projected/fadb85a0-e535-4a14-ba59-e76d370713b1-kube-api-access-gn6rp\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.178694 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fadb85a0-e535-4a14-ba59-e76d370713b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.308250 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.326826 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.550857 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.554769 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.859052 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgh7t" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.859566 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgh7t" event={"ID":"fadb85a0-e535-4a14-ba59-e76d370713b1","Type":"ContainerDied","Data":"04cc58bb8b46114b171085a0879296e16630d98db989a7d5e1f987b18c7901b3"} Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.859598 4693 scope.go:117] "RemoveContainer" containerID="e724374d1c0d84d711d483bb8c53264ac4d7c417fb9c296e18b6e84cb424751f" Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.914566 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:08:04 crc kubenswrapper[4693]: I1204 10:08:04.926218 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lgh7t"] Dec 04 10:08:05 crc kubenswrapper[4693]: I1204 10:08:05.681031 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sxjb6" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="registry-server" probeResult="failure" output=< Dec 04 10:08:05 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 10:08:05 crc kubenswrapper[4693]: > Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.349029 4693 scope.go:117] "RemoveContainer" containerID="3c48b8eca371e3ab44cd5cec654814ab8f3d5830db675b80b7637a7bf1333c2c" Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.435952 4693 scope.go:117] "RemoveContainer" containerID="c77827b8a0b46bcd56cd8779c41f091e6b4c1e082180f0c3a5e2451dea2c590b" Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.492271 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" 
path="/var/lib/kubelet/pods/fadb85a0-e535-4a14-ba59-e76d370713b1/volumes" Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.891756 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerStarted","Data":"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755"} Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.905159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" event={"ID":"22bd0022-0cff-49e8-96c0-ef6334718552","Type":"ContainerStarted","Data":"386ce87ef53f449cb8ab5a574e316340291000e5bc74d3c48282243b9de714db"} Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.906707 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:06 crc kubenswrapper[4693]: I1204 10:08:06.939969 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" podStartSLOduration=6.939946078 podStartE2EDuration="6.939946078s" podCreationTimestamp="2025-12-04 10:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:06.931470224 +0000 UTC m=+1532.829063977" watchObservedRunningTime="2025-12-04 10:08:06.939946078 +0000 UTC m=+1532.837539831" Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.917864 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7821c94d-7490-46ce-904f-ffc0e90b9b8b","Type":"ContainerStarted","Data":"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045"} Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.922298 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d55cc9d0-bc14-45af-978e-d72660ebcda0","Type":"ContainerStarted","Data":"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f"} Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.922467 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d55cc9d0-bc14-45af-978e-d72660ebcda0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f" gracePeriod=30 Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.933141 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerStarted","Data":"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907"} Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.933596 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerStarted","Data":"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959"} Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.939895 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerStarted","Data":"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f"} Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.939897 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" 
containerName="nova-metadata-log" containerID="cri-o://e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" gracePeriod=30 Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.939945 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-metadata" containerID="cri-o://e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" gracePeriod=30 Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.945899 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.346392933 podStartE2EDuration="7.945878318s" podCreationTimestamp="2025-12-04 10:08:00 +0000 UTC" firstStartedPulling="2025-12-04 10:08:01.843755541 +0000 UTC m=+1527.741349294" lastFinishedPulling="2025-12-04 10:08:06.443240926 +0000 UTC m=+1532.340834679" observedRunningTime="2025-12-04 10:08:07.938223876 +0000 UTC m=+1533.835817629" watchObservedRunningTime="2025-12-04 10:08:07.945878318 +0000 UTC m=+1533.843472071" Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.964751 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.182195017 podStartE2EDuration="7.964731109s" podCreationTimestamp="2025-12-04 10:08:00 +0000 UTC" firstStartedPulling="2025-12-04 10:08:01.656046795 +0000 UTC m=+1527.553640548" lastFinishedPulling="2025-12-04 10:08:06.438582887 +0000 UTC m=+1532.336176640" observedRunningTime="2025-12-04 10:08:07.962387424 +0000 UTC m=+1533.859981177" watchObservedRunningTime="2025-12-04 10:08:07.964731109 +0000 UTC m=+1533.862324862" Dec 04 10:08:07 crc kubenswrapper[4693]: I1204 10:08:07.992900 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.503702827 podStartE2EDuration="7.992879436s" podCreationTimestamp="2025-12-04 10:08:00 +0000 UTC" firstStartedPulling="2025-12-04 10:08:01.94726094 +0000 UTC m=+1527.844854683" lastFinishedPulling="2025-12-04 10:08:06.436437539 +0000 UTC m=+1532.334031292" observedRunningTime="2025-12-04 10:08:07.980533125 +0000 UTC m=+1533.878126888" watchObservedRunningTime="2025-12-04 10:08:07.992879436 +0000 UTC m=+1533.890473179" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.005989 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.197483591 podStartE2EDuration="8.005970019s" podCreationTimestamp="2025-12-04 10:08:00 +0000 UTC" firstStartedPulling="2025-12-04 10:08:01.634660575 +0000 UTC m=+1527.532254328" lastFinishedPulling="2025-12-04 10:08:06.443147003 +0000 UTC m=+1532.340740756" observedRunningTime="2025-12-04 10:08:07.999037687 +0000 UTC m=+1533.896631440" watchObservedRunningTime="2025-12-04 10:08:08.005970019 +0000 UTC m=+1533.903563772" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.637686 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.818945 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data\") pod \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.819541 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6kvm\" (UniqueName: \"kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm\") pod \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.819615 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs\") pod \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.819639 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle\") pod \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\" (UID: \"6c95629b-b509-4fe1-9e8c-e4e51c58aab6\") " Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.823738 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs" (OuterVolumeSpecName: "logs") pod "6c95629b-b509-4fe1-9e8c-e4e51c58aab6" (UID: "6c95629b-b509-4fe1-9e8c-e4e51c58aab6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.830721 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm" (OuterVolumeSpecName: "kube-api-access-x6kvm") pod "6c95629b-b509-4fe1-9e8c-e4e51c58aab6" (UID: "6c95629b-b509-4fe1-9e8c-e4e51c58aab6"). InnerVolumeSpecName "kube-api-access-x6kvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.863281 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c95629b-b509-4fe1-9e8c-e4e51c58aab6" (UID: "6c95629b-b509-4fe1-9e8c-e4e51c58aab6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.867225 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data" (OuterVolumeSpecName: "config-data") pod "6c95629b-b509-4fe1-9e8c-e4e51c58aab6" (UID: "6c95629b-b509-4fe1-9e8c-e4e51c58aab6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.922651 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.922705 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6kvm\" (UniqueName: \"kubernetes.io/projected/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-kube-api-access-x6kvm\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.922723 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.922735 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c95629b-b509-4fe1-9e8c-e4e51c58aab6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.968999 4693 generic.go:334] "Generic (PLEG): container finished" podID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerID="e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" exitCode=0 Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.969074 4693 generic.go:334] "Generic (PLEG): container finished" podID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerID="e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" exitCode=143 Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.969120 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerDied","Data":"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f"} Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.969212 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerDied","Data":"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755"} Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.969223 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c95629b-b509-4fe1-9e8c-e4e51c58aab6","Type":"ContainerDied","Data":"8787be2ec459dc0c5ad70e17cd778953dc28ae3eaa8983187f51db83a0942a89"} Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.969259 4693 scope.go:117] "RemoveContainer" containerID="e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" Dec 04 10:08:08 crc kubenswrapper[4693]: I1204 10:08:08.970537 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.001434 4693 scope.go:117] "RemoveContainer" containerID="e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.041140 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.042726 4693 scope.go:117] "RemoveContainer" containerID="e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.043410 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f\": container with ID starting with e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f not found: ID does not exist" containerID="e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.043446 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f"} err="failed to get container status \"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f\": rpc error: code = NotFound desc = could not find container \"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f\": container with ID starting with e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f not found: ID does not exist" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.043479 4693 scope.go:117] "RemoveContainer" containerID="e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.044032 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755\": container with ID starting with e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755 not found: ID does not exist" containerID="e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.044076 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755"} err="failed to get container status \"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755\": rpc error: code = NotFound desc = could not find container \"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755\": container with ID starting with e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755 not found: ID does not exist" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.044100 4693 scope.go:117] "RemoveContainer" containerID="e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.044564 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f"} err="failed to get container status \"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f\": rpc error: code = NotFound desc = could not find container \"e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f\": container with ID starting with 
e1f92ec4a61a28713e9061f9a4b2f58ec9a7ce78651c25bb3be216512d2d186f not found: ID does not exist" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.044585 4693 scope.go:117] "RemoveContainer" containerID="e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.044819 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755"} err="failed to get container status \"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755\": rpc error: code = NotFound desc = could not find container \"e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755\": container with ID starting with e942b1167dc6e3390081a36ab6c903f66013525c91df0ed0d5491c7e28b9c755 not found: ID does not exist" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.057701 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.074839 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.075801 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-metadata" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.075836 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-metadata" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.075856 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-log" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.075930 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-log" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.076013 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="extract-content" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076027 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="extract-content" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.076053 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="registry-server" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076061 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="registry-server" Dec 04 10:08:09 crc kubenswrapper[4693]: E1204 10:08:09.076082 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="extract-utilities" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076092 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="extract-utilities" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076443 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" containerName="nova-metadata-metadata" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076495 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" 
containerName="nova-metadata-log" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.076512 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fadb85a0-e535-4a14-ba59-e76d370713b1" containerName="registry-server" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.078138 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.085842 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.086195 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.097216 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.166314 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.168728 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75v2\" (UniqueName: \"kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.168933 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.169228 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.169526 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.272137 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.272670 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75v2\" (UniqueName: \"kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " 
pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.272822 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.272979 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.273130 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.273808 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.276295 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.279469 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.290254 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75v2\" (UniqueName: \"kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.290682 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data\") pod \"nova-metadata-0\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " pod="openstack/nova-metadata-0" Dec 04 10:08:09 crc kubenswrapper[4693]: I1204 10:08:09.468656 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:10 crc kubenswrapper[4693]: W1204 10:08:10.009576 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b4cd53_5335_452f_b7ad_34ababf00ab1.slice/crio-6c2b7be26634475e317115c22b65a31f8872dd0259f0db282a6cbac75830b0c5 WatchSource:0}: Error finding container 6c2b7be26634475e317115c22b65a31f8872dd0259f0db282a6cbac75830b0c5: Status 404 returned error can't find the container with id 6c2b7be26634475e317115c22b65a31f8872dd0259f0db282a6cbac75830b0c5 Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.018799 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.510431 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c95629b-b509-4fe1-9e8c-e4e51c58aab6" path="/var/lib/kubelet/pods/6c95629b-b509-4fe1-9e8c-e4e51c58aab6/volumes" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.805427 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.805525 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.831812 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.831900 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.860858 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.868127 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.883859 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.974347 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:08:10 crc kubenswrapper[4693]: I1204 10:08:10.974651 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="dnsmasq-dns" containerID="cri-o://c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470" gracePeriod=10 Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.008519 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerStarted","Data":"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7"} Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.009227 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerStarted","Data":"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8"} Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.009252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerStarted","Data":"6c2b7be26634475e317115c22b65a31f8872dd0259f0db282a6cbac75830b0c5"} Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.044745 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.044721577 podStartE2EDuration="2.044721577s" podCreationTimestamp="2025-12-04 10:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:11.038704491 +0000 UTC m=+1536.936298244" watchObservedRunningTime="2025-12-04 10:08:11.044721577 +0000 UTC m=+1536.942315330" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.065631 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.568100 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641053 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641150 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641239 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641311 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641449 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.641511 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvk5r\" (UniqueName: \"kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r\") pod \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\" (UID: \"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0\") " Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.655662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r" (OuterVolumeSpecName: "kube-api-access-fvk5r") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). 
InnerVolumeSpecName "kube-api-access-fvk5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.712878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.719971 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.745035 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvk5r\" (UniqueName: \"kubernetes.io/projected/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-kube-api-access-fvk5r\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.745081 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.745092 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.754243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.758004 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.769492 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config" (OuterVolumeSpecName: "config") pod "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" (UID: "9b95841c-9b4d-4f37-88f2-2f94dd1b57b0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.847781 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.848215 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.848225 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.890905 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:11 crc kubenswrapper[4693]: I1204 10:08:11.891262 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.023398 4693 generic.go:334] "Generic (PLEG): container finished" podID="5e81cf43-7573-4691-878b-eeae474d75be" containerID="a8bbc45cb758b8cc0c6d98a2d90f19d61fd7af8daf5de08f295908f76256a884" exitCode=0 Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.023736 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" event={"ID":"5e81cf43-7573-4691-878b-eeae474d75be","Type":"ContainerDied","Data":"a8bbc45cb758b8cc0c6d98a2d90f19d61fd7af8daf5de08f295908f76256a884"} Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.028812 4693 generic.go:334] "Generic (PLEG): container finished" podID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerID="c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470" exitCode=0 Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.029080 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" event={"ID":"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0","Type":"ContainerDied","Data":"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470"} Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.029266 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.052399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d6d889f-p9sjj" event={"ID":"9b95841c-9b4d-4f37-88f2-2f94dd1b57b0","Type":"ContainerDied","Data":"09eae557d0b7250ee9025512f376923331c53d44030bc4bca3aa8312f0181d97"} Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.052470 4693 scope.go:117] "RemoveContainer" containerID="c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.099952 4693 scope.go:117] "RemoveContainer" containerID="ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.104222 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.120819 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d6d889f-p9sjj"] Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.134879 4693 scope.go:117] "RemoveContainer" containerID="c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470" Dec 04 10:08:12 crc kubenswrapper[4693]: E1204 10:08:12.136656 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470\": container with ID starting with c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470 not found: ID does not exist" containerID="c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.136758 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470"} err="failed to get container status \"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470\": rpc error: code = NotFound desc = could not find container \"c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470\": container with ID starting with c058f80eb5adf4299d6a8581297037c3ced8024132b42dca5de7d92308e78470 not found: ID does not exist" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.136789 4693 scope.go:117] "RemoveContainer" containerID="ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f" Dec 04 10:08:12 crc kubenswrapper[4693]: E1204 10:08:12.137392 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f\": container with ID starting with ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f not found: ID does not exist" containerID="ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f" Dec 04 10:08:12 crc kubenswrapper[4693]: I1204 10:08:12.137464 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f"} err="failed to get container status \"ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f\": rpc error: code = NotFound desc = could not find container \"ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f\": container with ID starting with ce2830f799dd33c4f53816b70d7c5d5ff87a96c4c429c3ba35f2908a9256933f not found: ID does not exist" Dec 04 10:08:12 
crc kubenswrapper[4693]: I1204 10:08:12.478583 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" path="/var/lib/kubelet/pods/9b95841c-9b4d-4f37-88f2-2f94dd1b57b0/volumes" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.047370 4693 generic.go:334] "Generic (PLEG): container finished" podID="713c9761-3dbd-4889-9678-9acee2bd6635" containerID="8fcbc0ab777db633d734638b6dfc6f50f372f62d7296cb8fd8788f20b86b027f" exitCode=0 Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.047493 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xfxvd" event={"ID":"713c9761-3dbd-4889-9678-9acee2bd6635","Type":"ContainerDied","Data":"8fcbc0ab777db633d734638b6dfc6f50f372f62d7296cb8fd8788f20b86b027f"} Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.548452 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.593861 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2fwv\" (UniqueName: \"kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv\") pod \"5e81cf43-7573-4691-878b-eeae474d75be\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.594275 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data\") pod \"5e81cf43-7573-4691-878b-eeae474d75be\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.594320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle\") pod \"5e81cf43-7573-4691-878b-eeae474d75be\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.594482 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts\") pod \"5e81cf43-7573-4691-878b-eeae474d75be\" (UID: \"5e81cf43-7573-4691-878b-eeae474d75be\") " Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.600834 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts" (OuterVolumeSpecName: "scripts") pod "5e81cf43-7573-4691-878b-eeae474d75be" (UID: "5e81cf43-7573-4691-878b-eeae474d75be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.600968 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv" (OuterVolumeSpecName: "kube-api-access-r2fwv") pod "5e81cf43-7573-4691-878b-eeae474d75be" (UID: "5e81cf43-7573-4691-878b-eeae474d75be"). InnerVolumeSpecName "kube-api-access-r2fwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.625797 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e81cf43-7573-4691-878b-eeae474d75be" (UID: "5e81cf43-7573-4691-878b-eeae474d75be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.628850 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data" (OuterVolumeSpecName: "config-data") pod "5e81cf43-7573-4691-878b-eeae474d75be" (UID: "5e81cf43-7573-4691-878b-eeae474d75be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.696744 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.696779 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2fwv\" (UniqueName: \"kubernetes.io/projected/5e81cf43-7573-4691-878b-eeae474d75be-kube-api-access-r2fwv\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.696792 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:13 crc kubenswrapper[4693]: I1204 10:08:13.696804 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e81cf43-7573-4691-878b-eeae474d75be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.074690 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.076264 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2n7g" event={"ID":"5e81cf43-7573-4691-878b-eeae474d75be","Type":"ContainerDied","Data":"6dcfeccf963f436bc75a219e663bc7ab46d6307f1e58f4f8b9bd5d197ffe4025"} Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.076373 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dcfeccf963f436bc75a219e663bc7ab46d6307f1e58f4f8b9bd5d197ffe4025" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.149562 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:08:14 crc kubenswrapper[4693]: E1204 10:08:14.150153 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="dnsmasq-dns" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.150176 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="dnsmasq-dns" Dec 04 10:08:14 crc kubenswrapper[4693]: E1204 10:08:14.150203 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="init" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.150212 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="init" Dec 04 10:08:14 crc kubenswrapper[4693]: E1204 10:08:14.150252 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e81cf43-7573-4691-878b-eeae474d75be" containerName="nova-cell1-conductor-db-sync" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.150261 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e81cf43-7573-4691-878b-eeae474d75be" containerName="nova-cell1-conductor-db-sync" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.150499 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b95841c-9b4d-4f37-88f2-2f94dd1b57b0" containerName="dnsmasq-dns" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.150534 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e81cf43-7573-4691-878b-eeae474d75be" containerName="nova-cell1-conductor-db-sync" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.151425 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.155636 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.174404 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.207097 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.207171 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.207481 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njss\" (UniqueName: \"kubernetes.io/projected/7cece472-f359-4ce8-b1f8-17ca920f4b3d-kube-api-access-6njss\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.320547 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.321196 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njss\" (UniqueName: \"kubernetes.io/projected/7cece472-f359-4ce8-b1f8-17ca920f4b3d-kube-api-access-6njss\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.321542 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.329453 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.345273 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cece472-f359-4ce8-b1f8-17ca920f4b3d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.346094 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njss\" (UniqueName: \"kubernetes.io/projected/7cece472-f359-4ce8-b1f8-17ca920f4b3d-kube-api-access-6njss\") pod \"nova-cell1-conductor-0\" (UID: \"7cece472-f359-4ce8-b1f8-17ca920f4b3d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.474605 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.483102 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.483171 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.622522 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.629304 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.703744 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.730569 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle\") pod \"713c9761-3dbd-4889-9678-9acee2bd6635\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.730721 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts\") pod \"713c9761-3dbd-4889-9678-9acee2bd6635\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.730831 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sxqg\" (UniqueName: \"kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg\") pod \"713c9761-3dbd-4889-9678-9acee2bd6635\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.730874 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data\") pod \"713c9761-3dbd-4889-9678-9acee2bd6635\" (UID: \"713c9761-3dbd-4889-9678-9acee2bd6635\") " Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.748955 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg" (OuterVolumeSpecName: "kube-api-access-6sxqg") pod "713c9761-3dbd-4889-9678-9acee2bd6635" (UID: "713c9761-3dbd-4889-9678-9acee2bd6635"). InnerVolumeSpecName "kube-api-access-6sxqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.752757 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts" (OuterVolumeSpecName: "scripts") pod "713c9761-3dbd-4889-9678-9acee2bd6635" (UID: "713c9761-3dbd-4889-9678-9acee2bd6635"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.759840 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data" (OuterVolumeSpecName: "config-data") pod "713c9761-3dbd-4889-9678-9acee2bd6635" (UID: "713c9761-3dbd-4889-9678-9acee2bd6635"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.766527 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "713c9761-3dbd-4889-9678-9acee2bd6635" (UID: "713c9761-3dbd-4889-9678-9acee2bd6635"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.833657 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sxqg\" (UniqueName: \"kubernetes.io/projected/713c9761-3dbd-4889-9678-9acee2bd6635-kube-api-access-6sxqg\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.833716 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.833734 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.833748 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713c9761-3dbd-4889-9678-9acee2bd6635-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:14 crc kubenswrapper[4693]: I1204 10:08:14.862913 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.000753 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:08:15 crc kubenswrapper[4693]: W1204 10:08:15.000859 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cece472_f359_4ce8_b1f8_17ca920f4b3d.slice/crio-a1fa3fcb94752e6b50b13dcf0c235a64ea76fb876a1bb3732ef875fd0b1e3feb WatchSource:0}: Error finding container a1fa3fcb94752e6b50b13dcf0c235a64ea76fb876a1bb3732ef875fd0b1e3feb: Status 404 returned error can't find the container with id a1fa3fcb94752e6b50b13dcf0c235a64ea76fb876a1bb3732ef875fd0b1e3feb Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.088906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xfxvd" 
event={"ID":"713c9761-3dbd-4889-9678-9acee2bd6635","Type":"ContainerDied","Data":"ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61"} Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.089000 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceaca4d9f543e0026a58184b668420e84628867e57f517950f2218f3d4924a61" Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.088920 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xfxvd" Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.092500 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cece472-f359-4ce8-b1f8-17ca920f4b3d","Type":"ContainerStarted","Data":"a1fa3fcb94752e6b50b13dcf0c235a64ea76fb876a1bb3732ef875fd0b1e3feb"} Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.292224 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.292857 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-log" containerID="cri-o://286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959" gracePeriod=30 Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.293101 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-api" containerID="cri-o://7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907" gracePeriod=30 Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.310076 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.310327 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerName="nova-scheduler-scheduler" containerID="cri-o://294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" gracePeriod=30 Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.332263 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.332571 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-log" containerID="cri-o://dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" gracePeriod=30 Dec 04 10:08:15 crc kubenswrapper[4693]: I1204 10:08:15.332617 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-metadata" containerID="cri-o://e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" gracePeriod=30 Dec 04 10:08:15 crc kubenswrapper[4693]: E1204 10:08:15.833670 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:15 crc kubenswrapper[4693]: E1204 10:08:15.835493 4693 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:15 crc kubenswrapper[4693]: E1204 10:08:15.837129 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:15 crc kubenswrapper[4693]: E1204 10:08:15.837161 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerName="nova-scheduler-scheduler" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:15.967055 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.066741 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle\") pod \"37b4cd53-5335-452f-b7ad-34ababf00ab1\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.066800 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data\") pod \"37b4cd53-5335-452f-b7ad-34ababf00ab1\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.066943 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs\") pod \"37b4cd53-5335-452f-b7ad-34ababf00ab1\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.066975 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75v2\" (UniqueName: \"kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2\") pod \"37b4cd53-5335-452f-b7ad-34ababf00ab1\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.067040 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs\") pod \"37b4cd53-5335-452f-b7ad-34ababf00ab1\" (UID: \"37b4cd53-5335-452f-b7ad-34ababf00ab1\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.069405 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs" (OuterVolumeSpecName: "logs") pod "37b4cd53-5335-452f-b7ad-34ababf00ab1" (UID: "37b4cd53-5335-452f-b7ad-34ababf00ab1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.076465 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2" (OuterVolumeSpecName: "kube-api-access-z75v2") pod "37b4cd53-5335-452f-b7ad-34ababf00ab1" (UID: "37b4cd53-5335-452f-b7ad-34ababf00ab1"). InnerVolumeSpecName "kube-api-access-z75v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.102509 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data" (OuterVolumeSpecName: "config-data") pod "37b4cd53-5335-452f-b7ad-34ababf00ab1" (UID: "37b4cd53-5335-452f-b7ad-34ababf00ab1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104230 4693 generic.go:334] "Generic (PLEG): container finished" podID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerID="e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" exitCode=0 Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104253 4693 generic.go:334] "Generic (PLEG): container finished" podID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerID="dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" exitCode=143 Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104289 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerDied","Data":"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7"} Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104313 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerDied","Data":"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8"} Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37b4cd53-5335-452f-b7ad-34ababf00ab1","Type":"ContainerDied","Data":"6c2b7be26634475e317115c22b65a31f8872dd0259f0db282a6cbac75830b0c5"} Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104354 4693 scope.go:117] "RemoveContainer" containerID="e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.104485 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.106557 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7cece472-f359-4ce8-b1f8-17ca920f4b3d","Type":"ContainerStarted","Data":"16b9fcdff9e508b5608f7aa58718ec2ced3a6f68cb0aa70d77d9181f2523e5da"} Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.107561 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.109841 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37b4cd53-5335-452f-b7ad-34ababf00ab1" (UID: "37b4cd53-5335-452f-b7ad-34ababf00ab1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.110501 4693 generic.go:334] "Generic (PLEG): container finished" podID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerID="286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959" exitCode=143 Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.110650 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxjb6" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="registry-server" containerID="cri-o://1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3" gracePeriod=2 Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.110940 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerDied","Data":"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959"} Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.132734 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.132718197 podStartE2EDuration="2.132718197s" podCreationTimestamp="2025-12-04 10:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:16.126796284 +0000 UTC m=+1542.024390037" watchObservedRunningTime="2025-12-04 10:08:16.132718197 +0000 UTC m=+1542.030311950" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.141733 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37b4cd53-5335-452f-b7ad-34ababf00ab1" (UID: "37b4cd53-5335-452f-b7ad-34ababf00ab1"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.155400 4693 scope.go:117] "RemoveContainer" containerID="dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.169547 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.169575 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.169583 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37b4cd53-5335-452f-b7ad-34ababf00ab1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.169594 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75v2\" (UniqueName: \"kubernetes.io/projected/37b4cd53-5335-452f-b7ad-34ababf00ab1-kube-api-access-z75v2\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.169603 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37b4cd53-5335-452f-b7ad-34ababf00ab1-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.221984 4693 scope.go:117] "RemoveContainer" containerID="e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" Dec 04 10:08:16 crc kubenswrapper[4693]: E1204 10:08:16.222463 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7\": container with ID starting with e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7 not found: ID does not exist" containerID="e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.222490 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7"} err="failed to get container status \"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7\": rpc error: code = NotFound desc = could not find container \"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7\": container with ID starting with e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7 not found: ID does not exist" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.222510 4693 scope.go:117] "RemoveContainer" containerID="dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" Dec 04 10:08:16 crc kubenswrapper[4693]: E1204 10:08:16.222875 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8\": container with ID starting with dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8 not found: ID does not exist" containerID="dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.222922 4693 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8"} err="failed to get container status \"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8\": rpc error: code = NotFound desc = could not find container \"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8\": container with ID starting with dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8 not found: ID does not exist" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.222942 4693 scope.go:117] "RemoveContainer" containerID="e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.223191 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7"} err="failed to get container status \"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7\": rpc error: code = NotFound desc = could not find container \"e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7\": container with ID starting with e969b70b4b298392c6ab7a67e63d4da605a40b61eb3822ace00b85236d3b9cb7 not found: ID does not exist" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.223209 4693 scope.go:117] "RemoveContainer" containerID="dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.223534 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8"} err="failed to get container status \"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8\": rpc error: code = NotFound desc = could not find container \"dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8\": container with ID starting with dc5f8cfd75ebdb744b7cb73fa8c7bd2a3c0f70fddbb2b0beb1530ac95f1f04d8 not found: ID does not exist" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.510575 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.524102 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.557917 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:16 crc kubenswrapper[4693]: E1204 10:08:16.558592 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-metadata" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.558616 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-metadata" Dec 04 10:08:16 crc kubenswrapper[4693]: E1204 10:08:16.558647 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-log" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.558656 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-log" Dec 04 10:08:16 crc kubenswrapper[4693]: E1204 10:08:16.558692 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713c9761-3dbd-4889-9678-9acee2bd6635" containerName="nova-manage" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 
10:08:16.558701 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="713c9761-3dbd-4889-9678-9acee2bd6635" containerName="nova-manage" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.558962 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-log" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.558982 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="713c9761-3dbd-4889-9678-9acee2bd6635" containerName="nova-manage" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.558998 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" containerName="nova-metadata-metadata" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.560302 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.564321 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.564462 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.576900 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.580864 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7hz\" (UniqueName: \"kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.581032 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.581136 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.581233 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.581270 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.683719 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.683821 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.683883 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.683913 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.683978 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf7hz\" (UniqueName: \"kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.687776 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.689053 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.690382 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.691838 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.707325 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7hz\" (UniqueName: \"kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz\") pod \"nova-metadata-0\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.887777 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.900212 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.990838 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content\") pod \"d364c298-a01c-42a4-bf43-83dfc48b54a7\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.990887 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities\") pod \"d364c298-a01c-42a4-bf43-83dfc48b54a7\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.990914 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr8wc\" (UniqueName: \"kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc\") pod \"d364c298-a01c-42a4-bf43-83dfc48b54a7\" (UID: \"d364c298-a01c-42a4-bf43-83dfc48b54a7\") " Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.992202 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities" (OuterVolumeSpecName: "utilities") pod "d364c298-a01c-42a4-bf43-83dfc48b54a7" (UID: "d364c298-a01c-42a4-bf43-83dfc48b54a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:16 crc kubenswrapper[4693]: I1204 10:08:16.995220 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc" (OuterVolumeSpecName: "kube-api-access-mr8wc") pod "d364c298-a01c-42a4-bf43-83dfc48b54a7" (UID: "d364c298-a01c-42a4-bf43-83dfc48b54a7"). InnerVolumeSpecName "kube-api-access-mr8wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.019674 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d364c298-a01c-42a4-bf43-83dfc48b54a7" (UID: "d364c298-a01c-42a4-bf43-83dfc48b54a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.096857 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.096900 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d364c298-a01c-42a4-bf43-83dfc48b54a7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.096915 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr8wc\" (UniqueName: \"kubernetes.io/projected/d364c298-a01c-42a4-bf43-83dfc48b54a7-kube-api-access-mr8wc\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.136054 4693 generic.go:334] "Generic (PLEG): container finished" podID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerID="1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3" exitCode=0 Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.136143 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxjb6" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.136142 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerDied","Data":"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3"} Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.136274 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxjb6" event={"ID":"d364c298-a01c-42a4-bf43-83dfc48b54a7","Type":"ContainerDied","Data":"8f16f43a18a54ebe99069ba7059cddd88de91d995d17b2d66e2d5d399b60157b"} Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.136295 4693 scope.go:117] "RemoveContainer" containerID="1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.186548 4693 scope.go:117] "RemoveContainer" containerID="7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.201040 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.216025 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxjb6"] Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.216650 4693 scope.go:117] "RemoveContainer" containerID="c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.250780 4693 scope.go:117] "RemoveContainer" containerID="1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3" Dec 04 10:08:17 crc kubenswrapper[4693]: E1204 10:08:17.251522 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3\": container with ID starting with 1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3 not found: ID does not exist" containerID="1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.251563 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3"} err="failed to get container status \"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3\": rpc error: code = NotFound desc = could not find container \"1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3\": container with ID starting with 1ee83f2a997426d36d43a63319f77ff342106b959cd15cf7e8e1580945e622d3 not found: ID does not exist" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.251592 4693 scope.go:117] "RemoveContainer" containerID="7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde" Dec 04 10:08:17 crc kubenswrapper[4693]: E1204 10:08:17.252018 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde\": container with ID starting with 7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde not found: ID does not exist" containerID="7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.252102 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde"} err="failed to get container status \"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde\": rpc error: code = NotFound desc = could not find container \"7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde\": container with ID starting with 7901083fe2eadfc3514cf20fa8746e63d270dbcdb5452146a7d3cda7778aadde not found: ID does not exist" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.252159 4693 scope.go:117] "RemoveContainer" containerID="c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d" Dec 04 10:08:17 crc kubenswrapper[4693]: E1204 10:08:17.253877 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d\": container with ID starting with c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d not found: ID does not exist" containerID="c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d" Dec 04 10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.253937 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d"} err="failed to get container status \"c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d\": rpc error: code = NotFound desc = could not find container \"c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d\": container with ID starting with c31564f1ea1b7a210dda17317f9304524b925df727b23e1eba0a40e456a8c28d not found: ID does not exist" Dec 04 10:08:17 crc kubenswrapper[4693]: W1204 10:08:17.353020 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc416e59a_b555_493d_8171_f6bbfc91c7a3.slice/crio-264e6f4f4efa87a31bf872c2f3c21f9765ba82a8fb2b69f22ed6dd0700287c68 WatchSource:0}: Error finding container 264e6f4f4efa87a31bf872c2f3c21f9765ba82a8fb2b69f22ed6dd0700287c68: Status 404 returned error can't find the container with id 264e6f4f4efa87a31bf872c2f3c21f9765ba82a8fb2b69f22ed6dd0700287c68 Dec 04 
10:08:17 crc kubenswrapper[4693]: I1204 10:08:17.353654 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.150545 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerStarted","Data":"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b"} Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.150628 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerStarted","Data":"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9"} Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.150642 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerStarted","Data":"264e6f4f4efa87a31bf872c2f3c21f9765ba82a8fb2b69f22ed6dd0700287c68"} Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.174943 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.174912564 podStartE2EDuration="2.174912564s" podCreationTimestamp="2025-12-04 10:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:18.173417352 +0000 UTC m=+1544.071011105" watchObservedRunningTime="2025-12-04 10:08:18.174912564 +0000 UTC m=+1544.072506357" Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.473753 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b4cd53-5335-452f-b7ad-34ababf00ab1" path="/var/lib/kubelet/pods/37b4cd53-5335-452f-b7ad-34ababf00ab1/volumes" Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.474987 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" path="/var/lib/kubelet/pods/d364c298-a01c-42a4-bf43-83dfc48b54a7/volumes" Dec 04 10:08:18 crc kubenswrapper[4693]: I1204 10:08:18.973285 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.043238 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs\") pod \"56eb0da4-6210-4898-b5f8-4060701e29c9\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.043612 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data\") pod \"56eb0da4-6210-4898-b5f8-4060701e29c9\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.043648 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle\") pod \"56eb0da4-6210-4898-b5f8-4060701e29c9\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.043678 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbnxx\" (UniqueName: \"kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx\") pod \"56eb0da4-6210-4898-b5f8-4060701e29c9\" (UID: \"56eb0da4-6210-4898-b5f8-4060701e29c9\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.046529 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs" (OuterVolumeSpecName: "logs") pod "56eb0da4-6210-4898-b5f8-4060701e29c9" (UID: "56eb0da4-6210-4898-b5f8-4060701e29c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.070355 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx" (OuterVolumeSpecName: "kube-api-access-xbnxx") pod "56eb0da4-6210-4898-b5f8-4060701e29c9" (UID: "56eb0da4-6210-4898-b5f8-4060701e29c9"). InnerVolumeSpecName "kube-api-access-xbnxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.089762 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data" (OuterVolumeSpecName: "config-data") pod "56eb0da4-6210-4898-b5f8-4060701e29c9" (UID: "56eb0da4-6210-4898-b5f8-4060701e29c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.112002 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56eb0da4-6210-4898-b5f8-4060701e29c9" (UID: "56eb0da4-6210-4898-b5f8-4060701e29c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.127959 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.144994 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data\") pod \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145212 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc89b\" (UniqueName: \"kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b\") pod \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145348 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle\") pod \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\" (UID: \"7821c94d-7490-46ce-904f-ffc0e90b9b8b\") " Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145765 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145788 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eb0da4-6210-4898-b5f8-4060701e29c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145799 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbnxx\" (UniqueName: \"kubernetes.io/projected/56eb0da4-6210-4898-b5f8-4060701e29c9-kube-api-access-xbnxx\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.145809 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eb0da4-6210-4898-b5f8-4060701e29c9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.149569 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b" (OuterVolumeSpecName: "kube-api-access-kc89b") pod "7821c94d-7490-46ce-904f-ffc0e90b9b8b" (UID: "7821c94d-7490-46ce-904f-ffc0e90b9b8b"). InnerVolumeSpecName "kube-api-access-kc89b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.176938 4693 generic.go:334] "Generic (PLEG): container finished" podID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerID="7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907" exitCode=0 Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.176983 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerDied","Data":"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907"} Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.178282 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56eb0da4-6210-4898-b5f8-4060701e29c9","Type":"ContainerDied","Data":"d7b199d2ede5a4820e44ec24141bdcec02c2d8c4b9d644585d182c83efd50459"} Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.177015 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.178355 4693 scope.go:117] "RemoveContainer" containerID="7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.185848 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7821c94d-7490-46ce-904f-ffc0e90b9b8b" (UID: "7821c94d-7490-46ce-904f-ffc0e90b9b8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.189028 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data" (OuterVolumeSpecName: "config-data") pod "7821c94d-7490-46ce-904f-ffc0e90b9b8b" (UID: "7821c94d-7490-46ce-904f-ffc0e90b9b8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.189781 4693 generic.go:334] "Generic (PLEG): container finished" podID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" exitCode=0 Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.190199 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.191651 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7821c94d-7490-46ce-904f-ffc0e90b9b8b","Type":"ContainerDied","Data":"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045"} Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.191688 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7821c94d-7490-46ce-904f-ffc0e90b9b8b","Type":"ContainerDied","Data":"398f2e501c18887bf17717f26540922f164c414976a14277a73dc88aa85b1894"} Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.209757 4693 scope.go:117] "RemoveContainer" containerID="286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.227923 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.235181 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.247524 4693 scope.go:117] "RemoveContainer" containerID="7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.248617 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.248689 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7821c94d-7490-46ce-904f-ffc0e90b9b8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.248740 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc89b\" (UniqueName: \"kubernetes.io/projected/7821c94d-7490-46ce-904f-ffc0e90b9b8b-kube-api-access-kc89b\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.249451 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907\": container with ID starting with 7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907 not found: ID does not exist" containerID="7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.249518 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907"} err="failed to get container status \"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907\": rpc error: code = NotFound desc = could not find container \"7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907\": container with ID starting with 7bf4699ea5fed8730eb2453f5ae73cbffa4a548d76da0063caf1c25f7a6a9907 not found: ID does not exist" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.249551 4693 scope.go:117] "RemoveContainer" containerID="286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.251861 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959\": container with ID starting with 286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959 not found: ID does not exist" containerID="286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.251912 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959"} err="failed to get container status \"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959\": rpc error: code = NotFound desc = could not find container \"286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959\": container with ID starting with 286ab105107676d196f72cb11643b59723a166203a3bb27e88cf6416bfd3d959 not found: ID does not exist" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.251937 4693 scope.go:117] "RemoveContainer" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.275422 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.285425 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.285961 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-api" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.285982 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-api" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.286004 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="extract-content" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286014 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="extract-content" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.286046 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="extract-utilities" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286055 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="extract-utilities" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.286075 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerName="nova-scheduler-scheduler" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286084 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerName="nova-scheduler-scheduler" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.286105 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-log" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286114 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-log" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.286132 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="registry-server" Dec 04 
10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286139 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="registry-server" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286445 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-api" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286470 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" containerName="nova-scheduler-scheduler" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286487 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d364c298-a01c-42a4-bf43-83dfc48b54a7" containerName="registry-server" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.286504 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" containerName="nova-api-log" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.288089 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.293757 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.293957 4693 scope.go:117] "RemoveContainer" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" Dec 04 10:08:19 crc kubenswrapper[4693]: E1204 10:08:19.297989 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045\": container with ID starting with 294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045 not found: ID does not exist" containerID="294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.298039 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045"} err="failed to get container status \"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045\": rpc error: code = NotFound desc = could not find container \"294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045\": container with ID starting with 294fd94cef5f6c8c0a6f1c9cce663e860de3c6f031162ba6bca61fb7015e9045 not found: ID does not exist" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.298254 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.309270 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.321308 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.322689 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.325259 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.331440 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354360 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354458 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tkn\" (UniqueName: \"kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354499 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354525 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b24d\" (UniqueName: \"kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354620 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354639 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.354672 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457200 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tkn\" (UniqueName: \"kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457267 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b24d\" (UniqueName: \"kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457390 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457446 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.457528 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.458712 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.465052 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.472110 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.472827 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.473433 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.488978 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tkn\" (UniqueName: \"kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn\") pod \"nova-scheduler-0\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " pod="openstack/nova-scheduler-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.490113 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b24d\" (UniqueName: \"kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d\") pod \"nova-api-0\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.618430 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:19 crc kubenswrapper[4693]: I1204 10:08:19.638968 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.132900 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:20 crc kubenswrapper[4693]: W1204 10:08:20.139449 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f92e892_caed_4dd8_ae71_62e8d5be9d56.slice/crio-f595e41308fa5c8bf4fce8699c854362991ad7bfa14a5185abf4ba28a4585c10 WatchSource:0}: Error finding container f595e41308fa5c8bf4fce8699c854362991ad7bfa14a5185abf4ba28a4585c10: Status 404 returned error can't find the container with id f595e41308fa5c8bf4fce8699c854362991ad7bfa14a5185abf4ba28a4585c10 Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.140694 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:20 crc kubenswrapper[4693]: W1204 10:08:20.141177 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e4345aa_2c11_4997_ba04_11bf02f29c2f.slice/crio-1fa0579755712c1e0f1f7904e42d460d2135e90f04aa05c578970ebb4de29806 WatchSource:0}: Error finding container 1fa0579755712c1e0f1f7904e42d460d2135e90f04aa05c578970ebb4de29806: Status 404 returned error can't find the container with id 1fa0579755712c1e0f1f7904e42d460d2135e90f04aa05c578970ebb4de29806 Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.206813 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerStarted","Data":"1fa0579755712c1e0f1f7904e42d460d2135e90f04aa05c578970ebb4de29806"} Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.210522 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f92e892-caed-4dd8-ae71-62e8d5be9d56","Type":"ContainerStarted","Data":"f595e41308fa5c8bf4fce8699c854362991ad7bfa14a5185abf4ba28a4585c10"} Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.476307 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56eb0da4-6210-4898-b5f8-4060701e29c9" path="/var/lib/kubelet/pods/56eb0da4-6210-4898-b5f8-4060701e29c9/volumes" Dec 04 10:08:20 crc kubenswrapper[4693]: I1204 10:08:20.477008 
4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7821c94d-7490-46ce-904f-ffc0e90b9b8b" path="/var/lib/kubelet/pods/7821c94d-7490-46ce-904f-ffc0e90b9b8b/volumes" Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.245865 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerStarted","Data":"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a"} Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.246399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerStarted","Data":"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e"} Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.254430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f92e892-caed-4dd8-ae71-62e8d5be9d56","Type":"ContainerStarted","Data":"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f"} Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.283058 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.283035469 podStartE2EDuration="2.283035469s" podCreationTimestamp="2025-12-04 10:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:21.278895974 +0000 UTC m=+1547.176489727" watchObservedRunningTime="2025-12-04 10:08:21.283035469 +0000 UTC m=+1547.180629232" Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.305860 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.305839648 podStartE2EDuration="2.305839648s" podCreationTimestamp="2025-12-04 10:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:21.30043685 +0000 UTC m=+1547.198030633" watchObservedRunningTime="2025-12-04 10:08:21.305839648 +0000 UTC m=+1547.203433401" Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.889896 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:08:21 crc kubenswrapper[4693]: I1204 10:08:21.890277 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:08:23 crc kubenswrapper[4693]: I1204 10:08:23.856940 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:08:24 crc kubenswrapper[4693]: I1204 10:08:24.515979 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 10:08:24 crc kubenswrapper[4693]: I1204 10:08:24.640124 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:08:26 crc kubenswrapper[4693]: I1204 10:08:26.890007 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:08:26 crc kubenswrapper[4693]: I1204 10:08:26.890612 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:08:27 crc kubenswrapper[4693]: I1204 10:08:27.558177 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:27 crc 
kubenswrapper[4693]: I1204 10:08:27.558504 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2b831c57-110e-406f-b9a9-3c619add6639" containerName="kube-state-metrics" containerID="cri-o://657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1" gracePeriod=30 Dec 04 10:08:27 crc kubenswrapper[4693]: I1204 10:08:27.906583 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:27 crc kubenswrapper[4693]: I1204 10:08:27.906548 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.036444 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.209708 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsvpw\" (UniqueName: \"kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw\") pod \"2b831c57-110e-406f-b9a9-3c619add6639\" (UID: \"2b831c57-110e-406f-b9a9-3c619add6639\") " Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.217720 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw" (OuterVolumeSpecName: "kube-api-access-vsvpw") pod "2b831c57-110e-406f-b9a9-3c619add6639" (UID: "2b831c57-110e-406f-b9a9-3c619add6639"). InnerVolumeSpecName "kube-api-access-vsvpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.312240 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsvpw\" (UniqueName: \"kubernetes.io/projected/2b831c57-110e-406f-b9a9-3c619add6639-kube-api-access-vsvpw\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.326568 4693 generic.go:334] "Generic (PLEG): container finished" podID="2b831c57-110e-406f-b9a9-3c619add6639" containerID="657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1" exitCode=2 Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.326627 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.326656 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b831c57-110e-406f-b9a9-3c619add6639","Type":"ContainerDied","Data":"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1"} Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.327090 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b831c57-110e-406f-b9a9-3c619add6639","Type":"ContainerDied","Data":"1a98bd828ff539b9c30bfede6829c87d64dc982eb84378c6b50dd439c9c5a330"} Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.327111 4693 scope.go:117] "RemoveContainer" containerID="657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.372049 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.379123 4693 scope.go:117] "RemoveContainer" containerID="657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1" Dec 04 10:08:28 crc kubenswrapper[4693]: E1204 10:08:28.379662 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1\": container with ID starting with 657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1 not found: ID does not exist" containerID="657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.379702 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1"} err="failed to get container status \"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1\": rpc error: code = NotFound desc = could not find container \"657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1\": container with ID starting with 657412118b9663aab5061cb1b3c5230f02ee3f5574496056f0ac2aa33cc465b1 not found: ID does not exist" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.380253 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.391410 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:28 crc kubenswrapper[4693]: E1204 10:08:28.392135 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b831c57-110e-406f-b9a9-3c619add6639" containerName="kube-state-metrics" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.392158 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b831c57-110e-406f-b9a9-3c619add6639" containerName="kube-state-metrics" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.392407 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b831c57-110e-406f-b9a9-3c619add6639" containerName="kube-state-metrics" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.393393 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.397391 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.397547 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.401994 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.413474 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.413600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.413668 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.413705 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tc5\" (UniqueName: \"kubernetes.io/projected/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-api-access-n9tc5\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.477974 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b831c57-110e-406f-b9a9-3c619add6639" path="/var/lib/kubelet/pods/2b831c57-110e-406f-b9a9-3c619add6639/volumes" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.517266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.517786 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tc5\" (UniqueName: \"kubernetes.io/projected/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-api-access-n9tc5\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.517947 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" 
(UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.520091 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.522699 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.524617 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.537695 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.540173 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tc5\" (UniqueName: \"kubernetes.io/projected/4f9ccd88-ae2a-4026-b492-a09d29799c89-kube-api-access-n9tc5\") pod \"kube-state-metrics-0\" (UID: \"4f9ccd88-ae2a-4026-b492-a09d29799c89\") " pod="openstack/kube-state-metrics-0" Dec 04 10:08:28 crc kubenswrapper[4693]: I1204 10:08:28.722350 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.254504 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:08:29 crc kubenswrapper[4693]: W1204 10:08:29.265387 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9ccd88_ae2a_4026_b492_a09d29799c89.slice/crio-a1d9de02bcf96f94e0b373f216549622c225ae05aa4b62404c80ece8c4bd1789 WatchSource:0}: Error finding container a1d9de02bcf96f94e0b373f216549622c225ae05aa4b62404c80ece8c4bd1789: Status 404 returned error can't find the container with id a1d9de02bcf96f94e0b373f216549622c225ae05aa4b62404c80ece8c4bd1789 Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.337372 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4f9ccd88-ae2a-4026-b492-a09d29799c89","Type":"ContainerStarted","Data":"a1d9de02bcf96f94e0b373f216549622c225ae05aa4b62404c80ece8c4bd1789"} Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.501256 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.502108 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="sg-core" containerID="cri-o://7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468" gracePeriod=30 Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.502606 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-central-agent" containerID="cri-o://d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc" gracePeriod=30 Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.502211 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="proxy-httpd" containerID="cri-o://6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f" gracePeriod=30 Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.502261 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-notification-agent" containerID="cri-o://f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59" gracePeriod=30 Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.619486 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.619878 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.639147 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:08:29 crc kubenswrapper[4693]: I1204 10:08:29.676929 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.358889 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerID="6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f" exitCode=0 Dec 04 
10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.359217 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerID="7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468" exitCode=2 Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.359239 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerID="d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc" exitCode=0 Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.359382 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerDied","Data":"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f"} Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.359451 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerDied","Data":"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468"} Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.359476 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerDied","Data":"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc"} Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.364019 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4f9ccd88-ae2a-4026-b492-a09d29799c89","Type":"ContainerStarted","Data":"dd7189018d985e0019aab8e74a8755aa7303b93a16da5ef3fed03a6becf6cc5c"} Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.364082 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.395026 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.398714 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.037732425 podStartE2EDuration="2.398694118s" podCreationTimestamp="2025-12-04 10:08:28 +0000 UTC" firstStartedPulling="2025-12-04 10:08:29.269561054 +0000 UTC m=+1555.167154817" lastFinishedPulling="2025-12-04 10:08:29.630522757 +0000 UTC m=+1555.528116510" observedRunningTime="2025-12-04 10:08:30.392130927 +0000 UTC m=+1556.289724680" watchObservedRunningTime="2025-12-04 10:08:30.398694118 +0000 UTC m=+1556.296287881" Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.702577 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:30 crc kubenswrapper[4693]: I1204 10:08:30.702684 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.156211 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 
04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.158728 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.165686 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.284513 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.284572 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflm9\" (UniqueName: \"kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.284617 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.284855 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.379058 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerID="f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59" exitCode=0 Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.379122 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.379169 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerDied","Data":"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59"} Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.379201 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e","Type":"ContainerDied","Data":"84f0a047e2f91d2135153974ad4f9a451d61656437a01f1bf38987c396a53402"} Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.379220 4693 scope.go:117] "RemoveContainer" containerID="6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.385749 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.385856 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.385928 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.385945 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386033 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vstf\" (UniqueName: \"kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386059 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386086 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle\") pod \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\" (UID: \"9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e\") " Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386317 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities\") pod 
\"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386365 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflm9\" (UniqueName: \"kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386490 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.386984 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.387646 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.387679 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.387692 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.392108 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf" (OuterVolumeSpecName: "kube-api-access-5vstf") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "kube-api-access-5vstf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.392517 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts" (OuterVolumeSpecName: "scripts") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.407446 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflm9\" (UniqueName: \"kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9\") pod \"community-operators-tj5xn\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.448134 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.489825 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.489886 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.489897 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.489908 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vstf\" (UniqueName: \"kubernetes.io/projected/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-kube-api-access-5vstf\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.489918 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.525671 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.539518 4693 scope.go:117] "RemoveContainer" containerID="7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.567826 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data" (OuterVolumeSpecName: "config-data") pod "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" (UID: "9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.568122 4693 scope.go:117] "RemoveContainer" containerID="f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.581372 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.590806 4693 scope.go:117] "RemoveContainer" containerID="d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.592721 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.592753 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.635062 4693 scope.go:117] "RemoveContainer" containerID="6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.635665 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f\": container with ID starting with 6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f not found: ID does not exist" containerID="6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.635705 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f"} err="failed to get container status \"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f\": rpc error: code = NotFound desc = could not find container \"6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f\": container with ID starting with 6c51bc5d682a3515d8b07da5460199c9ff98fde234a191e7fa161d0804f5782f not found: ID does not exist" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.635742 4693 scope.go:117] "RemoveContainer" containerID="7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.636407 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468\": container with ID starting with 7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468 not found: ID does not exist" containerID="7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.636489 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468"} err="failed to get container status \"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468\": rpc error: code = NotFound desc = could not find container \"7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468\": container with ID starting with 
7e79d4b6d6bec855023e6f520ca1e051fc560bb74f8958199f0dde21ab4b9468 not found: ID does not exist" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.636541 4693 scope.go:117] "RemoveContainer" containerID="f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.636912 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59\": container with ID starting with f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59 not found: ID does not exist" containerID="f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.636955 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59"} err="failed to get container status \"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59\": rpc error: code = NotFound desc = could not find container \"f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59\": container with ID starting with f233924ccc9fbd9a122ef22bd55fbc01087fdfc468230439b79408240d470d59 not found: ID does not exist" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.637002 4693 scope.go:117] "RemoveContainer" containerID="d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.637375 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc\": container with ID starting with d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc not found: ID does not exist" containerID="d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.637409 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc"} err="failed to get container status \"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc\": rpc error: code = NotFound desc = could not find container \"d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc\": container with ID starting with d8c9bb19dee4349f66194c0a6debad613c409466ee83a3698de4dec1b13225fc not found: ID does not exist" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.726553 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.740268 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.757566 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.758055 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-central-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758067 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-central-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.758100 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="proxy-httpd" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758107 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="proxy-httpd" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.758118 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="sg-core" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758126 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="sg-core" Dec 04 10:08:31 crc kubenswrapper[4693]: E1204 10:08:31.758147 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-notification-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758153 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-notification-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758318 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="proxy-httpd" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758335 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-notification-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758410 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="ceilometer-central-agent" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.758429 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" containerName="sg-core" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.760221 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.765563 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.766421 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.779920 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.785628 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.797853 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798550 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798578 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798608 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798849 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.798950 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.799267 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2lm6\" (UniqueName: 
\"kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904009 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904100 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904157 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904196 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904275 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2lm6\" (UniqueName: \"kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904315 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904379 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.904405 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.905064 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.905892 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.913798 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.914252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.925024 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.928881 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.929557 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:31 crc kubenswrapper[4693]: I1204 10:08:31.934297 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2lm6\" (UniqueName: \"kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6\") pod \"ceilometer-0\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " pod="openstack/ceilometer-0" Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.087894 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.173164 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.392648 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerStarted","Data":"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2"} Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.392997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerStarted","Data":"b4df9ddf138023a426041ccda68944e73ddd7134973d1d05bcebc6d3b57c0f0a"} Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.473564 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e" path="/var/lib/kubelet/pods/9cee53ad-8b44-48e5-8a8f-d4fd8bf4998e/volumes" Dec 04 10:08:32 crc kubenswrapper[4693]: I1204 10:08:32.662542 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:33 crc kubenswrapper[4693]: I1204 10:08:33.407909 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerStarted","Data":"23227702628dbcaf94962975e42226b7462b6e1ab8241ac7b9ecb78568d7faa2"} Dec 04 10:08:33 crc kubenswrapper[4693]: I1204 10:08:33.412597 4693 generic.go:334] "Generic (PLEG): container finished" podID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerID="45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2" exitCode=0 Dec 04 10:08:33 crc kubenswrapper[4693]: I1204 10:08:33.412706 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerDied","Data":"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2"} Dec 04 10:08:34 crc kubenswrapper[4693]: I1204 10:08:34.425333 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerStarted","Data":"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66"} Dec 04 10:08:35 crc kubenswrapper[4693]: I1204 10:08:35.435736 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerStarted","Data":"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545"} Dec 04 10:08:35 crc kubenswrapper[4693]: I1204 10:08:35.437704 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerStarted","Data":"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80"} Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.449888 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerStarted","Data":"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e"} Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.452249 4693 generic.go:334] "Generic (PLEG): container finished" podID="e59447c1-4fc2-4413-a040-8b71e5b10885" 
containerID="9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80" exitCode=0 Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.452286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerDied","Data":"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80"} Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.898487 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.900586 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:08:36 crc kubenswrapper[4693]: I1204 10:08:36.922412 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:08:37 crc kubenswrapper[4693]: I1204 10:08:37.467238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerStarted","Data":"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d"} Dec 04 10:08:37 crc kubenswrapper[4693]: I1204 10:08:37.479343 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:08:37 crc kubenswrapper[4693]: I1204 10:08:37.531998 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tj5xn" podStartSLOduration=3.082852457 podStartE2EDuration="6.531982642s" podCreationTimestamp="2025-12-04 10:08:31 +0000 UTC" firstStartedPulling="2025-12-04 10:08:33.41511906 +0000 UTC m=+1559.312712813" lastFinishedPulling="2025-12-04 10:08:36.864249245 +0000 UTC m=+1562.761842998" observedRunningTime="2025-12-04 10:08:37.495444962 +0000 UTC m=+1563.393038705" watchObservedRunningTime="2025-12-04 10:08:37.531982642 +0000 UTC m=+1563.429576395" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.299470 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.445480 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle\") pod \"d55cc9d0-bc14-45af-978e-d72660ebcda0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.445521 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf8pb\" (UniqueName: \"kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb\") pod \"d55cc9d0-bc14-45af-978e-d72660ebcda0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.445595 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data\") pod \"d55cc9d0-bc14-45af-978e-d72660ebcda0\" (UID: \"d55cc9d0-bc14-45af-978e-d72660ebcda0\") " Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.450462 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb" (OuterVolumeSpecName: "kube-api-access-lf8pb") pod "d55cc9d0-bc14-45af-978e-d72660ebcda0" (UID: "d55cc9d0-bc14-45af-978e-d72660ebcda0"). InnerVolumeSpecName "kube-api-access-lf8pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.496967 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerStarted","Data":"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146"} Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.498680 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.499988 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d55cc9d0-bc14-45af-978e-d72660ebcda0" (UID: "d55cc9d0-bc14-45af-978e-d72660ebcda0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.505905 4693 generic.go:334] "Generic (PLEG): container finished" podID="d55cc9d0-bc14-45af-978e-d72660ebcda0" containerID="0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f" exitCode=137 Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.506573 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.506680 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d55cc9d0-bc14-45af-978e-d72660ebcda0","Type":"ContainerDied","Data":"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f"} Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.506720 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d55cc9d0-bc14-45af-978e-d72660ebcda0","Type":"ContainerDied","Data":"b7d8efbddb1d963f3ae560af0d76724a8c8a0941b7a8bf6b9d3ec7f7ffac4127"} Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.506742 4693 scope.go:117] "RemoveContainer" containerID="0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.512845 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data" (OuterVolumeSpecName: "config-data") pod "d55cc9d0-bc14-45af-978e-d72660ebcda0" (UID: "d55cc9d0-bc14-45af-978e-d72660ebcda0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.536287 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.367226028 podStartE2EDuration="7.536258057s" podCreationTimestamp="2025-12-04 10:08:31 +0000 UTC" firstStartedPulling="2025-12-04 10:08:32.661087939 +0000 UTC m=+1558.558681692" lastFinishedPulling="2025-12-04 10:08:37.830119968 +0000 UTC m=+1563.727713721" observedRunningTime="2025-12-04 10:08:38.528123302 +0000 UTC m=+1564.425717055" watchObservedRunningTime="2025-12-04 10:08:38.536258057 +0000 UTC m=+1564.433851820" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.547213 4693 scope.go:117] "RemoveContainer" containerID="0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f" Dec 04 10:08:38 crc kubenswrapper[4693]: E1204 10:08:38.547694 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f\": container with ID starting with 0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f not found: ID does not exist" containerID="0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.547752 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f"} err="failed to get container status \"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f\": rpc error: code = NotFound desc = could not find container \"0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f\": container with ID starting with 0954176a47ed9aadcdaa9a1b3c9ee4d75440ddc0c641f5299d984ce7687c347f not found: ID does not exist" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.547923 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.547953 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d55cc9d0-bc14-45af-978e-d72660ebcda0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.547968 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf8pb\" (UniqueName: \"kubernetes.io/projected/d55cc9d0-bc14-45af-978e-d72660ebcda0-kube-api-access-lf8pb\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.735601 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.854813 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.863174 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.890128 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:38 crc kubenswrapper[4693]: E1204 10:08:38.890682 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55cc9d0-bc14-45af-978e-d72660ebcda0" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.890725 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55cc9d0-bc14-45af-978e-d72660ebcda0" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.891010 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55cc9d0-bc14-45af-978e-d72660ebcda0" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.891887 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.895773 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.896049 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.896236 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.908345 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.955053 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtq8d\" (UniqueName: \"kubernetes.io/projected/bba2b0cd-4556-4a03-a111-d73471571173-kube-api-access-qtq8d\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.955164 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.955219 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.955372 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:38 crc kubenswrapper[4693]: I1204 10:08:38.955398 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.058462 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.058761 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.058854 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtq8d\" (UniqueName: \"kubernetes.io/projected/bba2b0cd-4556-4a03-a111-d73471571173-kube-api-access-qtq8d\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.058982 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.059141 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.064323 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.073053 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.073153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.073856 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba2b0cd-4556-4a03-a111-d73471571173-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.082270 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtq8d\" (UniqueName: \"kubernetes.io/projected/bba2b0cd-4556-4a03-a111-d73471571173-kube-api-access-qtq8d\") pod \"nova-cell1-novncproxy-0\" (UID: \"bba2b0cd-4556-4a03-a111-d73471571173\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.234313 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.624843 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.625735 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.630405 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.631123 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:08:39 crc kubenswrapper[4693]: I1204 10:08:39.728925 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:08:39 crc kubenswrapper[4693]: W1204 10:08:39.736449 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba2b0cd_4556_4a03_a111_d73471571173.slice/crio-758f7a2df697d30934aaa6b88653fad49d2b8f51aeb5c5f1c1304680e808426d WatchSource:0}: Error finding container 758f7a2df697d30934aaa6b88653fad49d2b8f51aeb5c5f1c1304680e808426d: Status 404 returned error can't find the container with id 758f7a2df697d30934aaa6b88653fad49d2b8f51aeb5c5f1c1304680e808426d Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.485761 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55cc9d0-bc14-45af-978e-d72660ebcda0" path="/var/lib/kubelet/pods/d55cc9d0-bc14-45af-978e-d72660ebcda0/volumes" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.537487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bba2b0cd-4556-4a03-a111-d73471571173","Type":"ContainerStarted","Data":"1564ad4d5cf3fc56c818cc41ba93d827f2cc2171cf12c51ad15ec60c6dd2c218"} Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.537540 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bba2b0cd-4556-4a03-a111-d73471571173","Type":"ContainerStarted","Data":"758f7a2df697d30934aaa6b88653fad49d2b8f51aeb5c5f1c1304680e808426d"} Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.537823 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.554117 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.563050 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.563030228 podStartE2EDuration="2.563030228s" podCreationTimestamp="2025-12-04 10:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:40.554217174 +0000 UTC m=+1566.451810927" watchObservedRunningTime="2025-12-04 10:08:40.563030228 +0000 UTC m=+1566.460623981" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.744930 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.747384 4693 util.go:30] "No sandbox for pod can be found. 
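The two "Observed pod startup duration" records above can be cross-checked by hand: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that span minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For nova-cell1-novncproxy-0 the pull timestamps are zero values, so the two durations coincide at 2.563s; for ceilometer-0 they differ by the roughly 5.17s pull. A minimal sketch of the ceilometer-0 arithmetic, assuming that reading of the fields (it is inferred from the quoted values, not taken from kubelet documentation):

```python
# Seconds past 10:08 copied from the ceilometer-0 "Observed pod startup
# duration" record above (all four timestamps fall within the same minute).
created      = 31.0          # podCreationTimestamp      2025-12-04 10:08:31
first_pull   = 32.661087939  # firstStartedPulling       10:08:32.661087939
last_pull    = 37.830119968  # lastFinishedPulling       10:08:37.830119968
observed_run = 38.536258057  # watchObservedRunningTime  10:08:38.536258057

e2e = observed_run - created          # logged: podStartE2EDuration=7.536258057s
slo = e2e - (last_pull - first_pull)  # logged: podStartSLOduration=2.367226028
print(f"E2E ≈ {e2e:.9f}s  SLO ≈ {slo:.9f}s")
```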
Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.789393 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796447 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc654\" (UniqueName: \"kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796566 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796605 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.796629 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899372 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc654\" (UniqueName: \"kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899451 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899485 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899517 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.899542 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.900325 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.900972 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.901766 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.902290 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.902934 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:40 crc kubenswrapper[4693]: I1204 10:08:40.920567 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc654\" (UniqueName: 
\"kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654\") pod \"dnsmasq-dns-5958d5dc75-r2lpz\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:41 crc kubenswrapper[4693]: I1204 10:08:41.073780 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:41 crc kubenswrapper[4693]: I1204 10:08:41.795959 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:41 crc kubenswrapper[4693]: I1204 10:08:41.796184 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:41 crc kubenswrapper[4693]: I1204 10:08:41.826726 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:08:41 crc kubenswrapper[4693]: W1204 10:08:41.840583 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdebc460_1495_4d6e_9621_3117453dd084.slice/crio-f26b82d567036128005ed2d0bc3a7a32bc80a44e2b077bc007d2ebdf1bf6d7b6 WatchSource:0}: Error finding container f26b82d567036128005ed2d0bc3a7a32bc80a44e2b077bc007d2ebdf1bf6d7b6: Status 404 returned error can't find the container with id f26b82d567036128005ed2d0bc3a7a32bc80a44e2b077bc007d2ebdf1bf6d7b6 Dec 04 10:08:41 crc kubenswrapper[4693]: I1204 10:08:41.918919 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.667046 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.667591 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-central-agent" containerID="cri-o://cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" gracePeriod=30 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.667643 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-notification-agent" containerID="cri-o://014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" gracePeriod=30 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.667663 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="sg-core" containerID="cri-o://e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" gracePeriod=30 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.667931 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="proxy-httpd" containerID="cri-o://d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" gracePeriod=30 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.846410 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerID="d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" exitCode=0 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.847575 4693 
generic.go:334] "Generic (PLEG): container finished" podID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerID="e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" exitCode=2 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.846623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerDied","Data":"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146"} Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.847674 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerDied","Data":"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e"} Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.851307 4693 generic.go:334] "Generic (PLEG): container finished" podID="fdebc460-1495-4d6e-9621-3117453dd084" containerID="a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace" exitCode=0 Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.851465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" event={"ID":"fdebc460-1495-4d6e-9621-3117453dd084","Type":"ContainerDied","Data":"a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace"} Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.851504 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" event={"ID":"fdebc460-1495-4d6e-9621-3117453dd084","Type":"ContainerStarted","Data":"f26b82d567036128005ed2d0bc3a7a32bc80a44e2b077bc007d2ebdf1bf6d7b6"} Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.929959 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:42 crc kubenswrapper[4693]: I1204 10:08:42.986197 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.659079 4693 util.go:48] "No ready sandbox for pod can be found. 
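The "container finished" records carry the raw exit codes: 0 and 2 for the ceilometer containers just above, 137 earlier for the novncproxy container, and 143 further down for nova-api-log. Under the common 128+N convention, codes above 128 indicate termination by signal N, so 137 reads as SIGKILL and 143 as SIGTERM, the latter consistent with the kills logged with a 30-second grace period. A small helper illustrating that convention (the convention is standard shell/runtime practice; the helper is not derived from kubelet code):

```python
import signal

def describe_exit_code(code: int) -> str:
    """Describe a container exit code using the 128+N signal convention."""
    if code > 128:
        try:
            return f"terminated by {signal.Signals(code - 128).name}"
        except ValueError:
            return f"terminated by signal {code - 128}"
    return "clean exit" if code == 0 else f"exited with status {code}"

# Exit codes that appear in this excerpt: 137, 143, 2, 0.
for code in (137, 143, 2, 0):
    print(code, "->", describe_exit_code(code))
```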
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.749725 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.749836 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2lm6\" (UniqueName: \"kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.749898 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.749938 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750012 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750163 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750241 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750285 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts\") pod \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\" (UID: \"ac5a8f66-35e3-4976-b13a-8bef98f9bae1\") " Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750401 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750934 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.750958 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.762542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts" (OuterVolumeSpecName: "scripts") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.762717 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6" (OuterVolumeSpecName: "kube-api-access-j2lm6") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "kube-api-access-j2lm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.785767 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.800472 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.831601 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852318 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852779 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852810 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852822 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852831 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852840 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2lm6\" (UniqueName: \"kubernetes.io/projected/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-kube-api-access-j2lm6\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.852849 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867163 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerID="014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" exitCode=0 Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867196 4693 generic.go:334] "Generic (PLEG): container finished" podID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerID="cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" exitCode=0 Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867243 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerDied","Data":"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545"} Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867473 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerDied","Data":"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66"} Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867504 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac5a8f66-35e3-4976-b13a-8bef98f9bae1","Type":"ContainerDied","Data":"23227702628dbcaf94962975e42226b7462b6e1ab8241ac7b9ecb78568d7faa2"} Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867521 4693 scope.go:117] "RemoveContainer" containerID="d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.867670 4693 util.go:48] "No ready sandbox for pod can be found. 
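The ceilometer-0 teardown above shows the same fixed per-volume sequence as the novncproxy teardown earlier: reconciler_common logs "UnmountVolume started", operation_generator logs "UnmountVolume.TearDown succeeded", and reconciler_common finally reports "Volume detached" (config-data's final record appears a few lines further down). A rough sketch for grouping those phases by volume when reading a saved copy of this journal; the file path is hypothetical, the regexes only target the message shapes visible in this excerpt, and one record per line is assumed, as journalctl normally emits them:

```python
import re
from collections import defaultdict

# Message fragments copied from the journal excerpt above.
STARTED   = 'operationExecutor.UnmountVolume started for volume'
TORN_DOWN = 'UnmountVolume.TearDown succeeded for volume'
DETACHED  = 'Volume detached for volume'

# Short volume name right after the marker; quotes may be escaped (\") or not.
NAME       = re.compile(r'volume \\?"([^"\\]+)\\?"')
OUTER_NAME = re.compile(r'OuterVolumeSpecName: "([^"]+)"')

def teardown_progress(journal_text: str) -> dict:
    """Return {volume name: [unmount phases seen, in order]}."""
    progress = defaultdict(list)
    for line in journal_text.splitlines():
        if STARTED in line and (m := NAME.search(line)):
            progress[m.group(1)].append("unmount started")
        elif TORN_DOWN in line and (m := OUTER_NAME.search(line)):
            progress[m.group(1)].append("teardown succeeded")
        elif DETACHED in line and (m := NAME.search(line)):
            progress[m.group(1)].append("detached")
    return dict(progress)

# Hypothetical usage:
#   print(teardown_progress(open("kubenswrapper.log").read()))
```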
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.876950 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" event={"ID":"fdebc460-1495-4d6e-9621-3117453dd084","Type":"ContainerStarted","Data":"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed"} Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.877127 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-log" containerID="cri-o://ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e" gracePeriod=30 Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.877227 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.877276 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-api" containerID="cri-o://95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a" gracePeriod=30 Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.901598 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" podStartSLOduration=3.901578019 podStartE2EDuration="3.901578019s" podCreationTimestamp="2025-12-04 10:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:43.898752641 +0000 UTC m=+1569.796346394" watchObservedRunningTime="2025-12-04 10:08:43.901578019 +0000 UTC m=+1569.799171772" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.904401 4693 scope.go:117] "RemoveContainer" containerID="e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.911082 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data" (OuterVolumeSpecName: "config-data") pod "ac5a8f66-35e3-4976-b13a-8bef98f9bae1" (UID: "ac5a8f66-35e3-4976-b13a-8bef98f9bae1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.935839 4693 scope.go:117] "RemoveContainer" containerID="014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.956040 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac5a8f66-35e3-4976-b13a-8bef98f9bae1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:43 crc kubenswrapper[4693]: I1204 10:08:43.969584 4693 scope.go:117] "RemoveContainer" containerID="cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.108964 4693 scope.go:117] "RemoveContainer" containerID="d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.109798 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146\": container with ID starting with d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146 not found: ID does not exist" containerID="d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.109858 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146"} err="failed to get container status \"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146\": rpc error: code = NotFound desc = could not find container \"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146\": container with ID starting with d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.109889 4693 scope.go:117] "RemoveContainer" containerID="e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.110281 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e\": container with ID starting with e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e not found: ID does not exist" containerID="e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110321 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e"} err="failed to get container status \"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e\": rpc error: code = NotFound desc = could not find container \"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e\": container with ID starting with e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110367 4693 scope.go:117] "RemoveContainer" containerID="014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.110658 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545\": container with ID starting with 014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545 not found: ID does not exist" containerID="014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110687 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545"} err="failed to get container status \"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545\": rpc error: code = NotFound desc = could not find container \"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545\": container with ID starting with 014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110705 4693 scope.go:117] "RemoveContainer" containerID="cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.110931 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66\": container with ID starting with cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66 not found: ID does not exist" containerID="cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110966 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66"} err="failed to get container status \"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66\": rpc error: code = NotFound desc = could not find container \"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66\": container with ID starting with cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.110984 4693 scope.go:117] "RemoveContainer" containerID="d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.111259 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146"} err="failed to get container status \"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146\": rpc error: code = NotFound desc = could not find container \"d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146\": container with ID starting with d74f286ed71479d4afcddbf5e4152d692b7017a911c80c55f0c897daf9947146 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.111388 4693 scope.go:117] "RemoveContainer" containerID="e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.111614 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e"} err="failed to get container status \"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e\": rpc error: code = NotFound desc = could not find container \"e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e\": container with ID starting with 
e8de25da0b606ad52fc02077b8bd5695ff0a7ed808c034c905112349ff9d7b1e not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.111642 4693 scope.go:117] "RemoveContainer" containerID="014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.111953 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545"} err="failed to get container status \"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545\": rpc error: code = NotFound desc = could not find container \"014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545\": container with ID starting with 014f7fc4082950f0e345ec8bd5fdbd7538b67cbcb765e7598305764c041eb545 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.112030 4693 scope.go:117] "RemoveContainer" containerID="cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.112609 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66"} err="failed to get container status \"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66\": rpc error: code = NotFound desc = could not find container \"cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66\": container with ID starting with cde29142c7adfaf3359a1e4fc08ecb4821daab851200c725563f16007b2c3f66 not found: ID does not exist" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.205260 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.233211 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.250267 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.261998 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.263718 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="sg-core" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.263747 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="sg-core" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.263788 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="proxy-httpd" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.263797 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="proxy-httpd" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.263827 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-notification-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.263837 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-notification-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: E1204 10:08:44.263854 4693 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-central-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.263864 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-central-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.264430 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="sg-core" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.264474 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="proxy-httpd" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.264497 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-central-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.264514 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" containerName="ceilometer-notification-agent" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.270699 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.274713 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.274951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.275169 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.298675 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.365400 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.365555 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366144 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366190 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkz4b\" (UniqueName: \"kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366298 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366362 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366434 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.366456 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468214 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468270 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468372 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468428 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468456 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkz4b\" (UniqueName: \"kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 
crc kubenswrapper[4693]: I1204 10:08:44.468536 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.468566 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.473125 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.473257 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.473583 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5a8f66-35e3-4976-b13a-8bef98f9bae1" path="/var/lib/kubelet/pods/ac5a8f66-35e3-4976-b13a-8bef98f9bae1/volumes" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.473955 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.476743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.477526 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.478134 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.517264 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.527237 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkz4b\" (UniqueName: 
\"kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b\") pod \"ceilometer-0\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.597519 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.895024 4693 generic.go:334] "Generic (PLEG): container finished" podID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerID="ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e" exitCode=143 Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.895452 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerDied","Data":"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e"} Dec 04 10:08:44 crc kubenswrapper[4693]: I1204 10:08:44.896908 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tj5xn" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="registry-server" containerID="cri-o://42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d" gracePeriod=2 Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.081113 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:45 crc kubenswrapper[4693]: E1204 10:08:45.168323 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59447c1_4fc2_4413_a040_8b71e5b10885.slice/crio-42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.432398 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.488160 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities\") pod \"e59447c1-4fc2-4413-a040-8b71e5b10885\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.488535 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content\") pod \"e59447c1-4fc2-4413-a040-8b71e5b10885\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.488595 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dflm9\" (UniqueName: \"kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9\") pod \"e59447c1-4fc2-4413-a040-8b71e5b10885\" (UID: \"e59447c1-4fc2-4413-a040-8b71e5b10885\") " Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.489384 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities" (OuterVolumeSpecName: "utilities") pod "e59447c1-4fc2-4413-a040-8b71e5b10885" (UID: "e59447c1-4fc2-4413-a040-8b71e5b10885"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.495036 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9" (OuterVolumeSpecName: "kube-api-access-dflm9") pod "e59447c1-4fc2-4413-a040-8b71e5b10885" (UID: "e59447c1-4fc2-4413-a040-8b71e5b10885"). InnerVolumeSpecName "kube-api-access-dflm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.546378 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e59447c1-4fc2-4413-a040-8b71e5b10885" (UID: "e59447c1-4fc2-4413-a040-8b71e5b10885"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.591478 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.591511 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dflm9\" (UniqueName: \"kubernetes.io/projected/e59447c1-4fc2-4413-a040-8b71e5b10885-kube-api-access-dflm9\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.591523 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59447c1-4fc2-4413-a040-8b71e5b10885-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.906518 4693 generic.go:334] "Generic (PLEG): container finished" podID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerID="42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d" exitCode=0 Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.906587 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tj5xn" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.906607 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerDied","Data":"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d"} Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.907702 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj5xn" event={"ID":"e59447c1-4fc2-4413-a040-8b71e5b10885","Type":"ContainerDied","Data":"b4df9ddf138023a426041ccda68944e73ddd7134973d1d05bcebc6d3b57c0f0a"} Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.907735 4693 scope.go:117] "RemoveContainer" containerID="42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.909751 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerStarted","Data":"a624776e6233eb66a369870145960064fca8f704ca7eb0ff27b7a2385a22ea40"} Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.909779 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerStarted","Data":"289650d7e2fe85f24011b8931249579748f59c0e1332243343f9a70271025da1"} Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.928620 4693 scope.go:117] "RemoveContainer" containerID="9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80" Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.953248 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.963755 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tj5xn"] Dec 04 10:08:45 crc kubenswrapper[4693]: I1204 10:08:45.979528 4693 scope.go:117] "RemoveContainer" containerID="45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.017244 4693 scope.go:117] "RemoveContainer" containerID="42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d" Dec 04 10:08:46 crc kubenswrapper[4693]: E1204 10:08:46.017740 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d\": container with ID starting with 42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d not found: ID does not exist" containerID="42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.017780 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d"} err="failed to get container status \"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d\": rpc error: code = NotFound desc = could not find container \"42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d\": container with ID starting with 42ee71988c018ccd36b935a5486fdb7296fdd1f3519dc53541965d322759f56d not found: ID does not exist" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.017809 4693 scope.go:117] "RemoveContainer" 
containerID="9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80" Dec 04 10:08:46 crc kubenswrapper[4693]: E1204 10:08:46.018119 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80\": container with ID starting with 9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80 not found: ID does not exist" containerID="9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.018144 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80"} err="failed to get container status \"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80\": rpc error: code = NotFound desc = could not find container \"9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80\": container with ID starting with 9879f3a20a34c62137bc929161ee97db701e3f45af077656557dc4c9e6013d80 not found: ID does not exist" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.018160 4693 scope.go:117] "RemoveContainer" containerID="45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2" Dec 04 10:08:46 crc kubenswrapper[4693]: E1204 10:08:46.018616 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2\": container with ID starting with 45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2 not found: ID does not exist" containerID="45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.018658 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2"} err="failed to get container status \"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2\": rpc error: code = NotFound desc = could not find container \"45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2\": container with ID starting with 45373636ff9c9484cc419b075508812ecc6829cf541d89c4e62a18b3986d0ff2 not found: ID does not exist" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.476034 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" path="/var/lib/kubelet/pods/e59447c1-4fc2-4413-a040-8b71e5b10885/volumes" Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.569027 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:46 crc kubenswrapper[4693]: I1204 10:08:46.922065 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerStarted","Data":"df5661e5faf7eae0545daf2f499c13399fc8a40b7acfb8d9fb62b68dafb0589f"} Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.578602 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.631084 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs\") pod \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.631196 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle\") pod \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.631271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data\") pod \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.631365 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b24d\" (UniqueName: \"kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d\") pod \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\" (UID: \"8e4345aa-2c11-4997-ba04-11bf02f29c2f\") " Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.631965 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs" (OuterVolumeSpecName: "logs") pod "8e4345aa-2c11-4997-ba04-11bf02f29c2f" (UID: "8e4345aa-2c11-4997-ba04-11bf02f29c2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.632441 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e4345aa-2c11-4997-ba04-11bf02f29c2f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.645700 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d" (OuterVolumeSpecName: "kube-api-access-6b24d") pod "8e4345aa-2c11-4997-ba04-11bf02f29c2f" (UID: "8e4345aa-2c11-4997-ba04-11bf02f29c2f"). InnerVolumeSpecName "kube-api-access-6b24d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.678554 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data" (OuterVolumeSpecName: "config-data") pod "8e4345aa-2c11-4997-ba04-11bf02f29c2f" (UID: "8e4345aa-2c11-4997-ba04-11bf02f29c2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.682641 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e4345aa-2c11-4997-ba04-11bf02f29c2f" (UID: "8e4345aa-2c11-4997-ba04-11bf02f29c2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.734953 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.734992 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e4345aa-2c11-4997-ba04-11bf02f29c2f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.735004 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b24d\" (UniqueName: \"kubernetes.io/projected/8e4345aa-2c11-4997-ba04-11bf02f29c2f-kube-api-access-6b24d\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.933506 4693 generic.go:334] "Generic (PLEG): container finished" podID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerID="95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a" exitCode=0 Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.933569 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerDied","Data":"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a"} Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.933593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e4345aa-2c11-4997-ba04-11bf02f29c2f","Type":"ContainerDied","Data":"1fa0579755712c1e0f1f7904e42d460d2135e90f04aa05c578970ebb4de29806"} Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.933610 4693 scope.go:117] "RemoveContainer" containerID="95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.933723 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.947404 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerStarted","Data":"81e87075d58bbf2789d2cd1851b2d28684e1fefe6943809ab5d065f17b259479"} Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.974559 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:47 crc kubenswrapper[4693]: I1204 10:08:47.981275 4693 scope.go:117] "RemoveContainer" containerID="ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.001509 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.015396 4693 scope.go:117] "RemoveContainer" containerID="95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.017934 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a\": container with ID starting with 95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a not found: ID does not exist" containerID="95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.017986 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a"} err="failed to get container status \"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a\": rpc error: code = NotFound desc = could not find container \"95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a\": container with ID starting with 95fc2a134476ba7d14ce63a92396ac4d0bbf14a3107d2b8a6c0011526a46469a not found: ID does not exist" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.018013 4693 scope.go:117] "RemoveContainer" containerID="ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.025659 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.026220 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="extract-content" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026237 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="extract-content" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.026286 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="registry-server" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026294 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="registry-server" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.026315 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="extract-utilities" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026321 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" 
containerName="extract-utilities" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.026347 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-log" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026353 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-log" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.026362 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-api" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026367 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-api" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026581 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-api" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026603 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59447c1-4fc2-4413-a040-8b71e5b10885" containerName="registry-server" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.026629 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" containerName="nova-api-log" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.027727 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.030824 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.031045 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.031372 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.037520 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043128 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043214 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043391 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd62v\" (UniqueName: 
\"kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043511 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.043540 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: E1204 10:08:48.051196 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e\": container with ID starting with ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e not found: ID does not exist" containerID="ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.051612 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e"} err="failed to get container status \"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e\": rpc error: code = NotFound desc = could not find container \"ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e\": container with ID starting with ca3879cfb9a2e7178a9ac9b97772e313b971db30b0c129230d7182e860995a3e not found: ID does not exist" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145123 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145179 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd62v\" (UniqueName: \"kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145234 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145399 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.145444 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.150955 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.151385 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.153989 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.155027 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.162315 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.183274 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd62v\" (UniqueName: \"kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v\") pod \"nova-api-0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.377181 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.496474 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4345aa-2c11-4997-ba04-11bf02f29c2f" path="/var/lib/kubelet/pods/8e4345aa-2c11-4997-ba04-11bf02f29c2f/volumes" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.948981 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.968529 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerStarted","Data":"edd322935d0415c9d51e340aa8938298ff3c386bedcc9bbb3d06a6caa165387b"} Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.969535 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-central-agent" containerID="cri-o://a624776e6233eb66a369870145960064fca8f704ca7eb0ff27b7a2385a22ea40" gracePeriod=30 Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.969605 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-notification-agent" containerID="cri-o://df5661e5faf7eae0545daf2f499c13399fc8a40b7acfb8d9fb62b68dafb0589f" gracePeriod=30 Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.969657 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="proxy-httpd" containerID="cri-o://edd322935d0415c9d51e340aa8938298ff3c386bedcc9bbb3d06a6caa165387b" gracePeriod=30 Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.969603 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="sg-core" containerID="cri-o://81e87075d58bbf2789d2cd1851b2d28684e1fefe6943809ab5d065f17b259479" gracePeriod=30 Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.969805 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:08:48 crc kubenswrapper[4693]: I1204 10:08:48.995890 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8094412549999999 podStartE2EDuration="4.995866473s" podCreationTimestamp="2025-12-04 10:08:44 +0000 UTC" firstStartedPulling="2025-12-04 10:08:45.088669473 +0000 UTC m=+1570.986263226" lastFinishedPulling="2025-12-04 10:08:48.275094691 +0000 UTC m=+1574.172688444" observedRunningTime="2025-12-04 10:08:48.987743059 +0000 UTC m=+1574.885336822" watchObservedRunningTime="2025-12-04 10:08:48.995866473 +0000 UTC m=+1574.893460216" Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.235941 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.261381 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.979684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerStarted","Data":"e82df1afb12221d8a7d8e63b9675df9bdcabd3683c5c66db219972cd63432172"} Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.979997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerStarted","Data":"cde263828393acff06af27a46e3b573fbf1d695fbdde1e406ea404f77d2ea042"} Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.980013 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerStarted","Data":"a61ce5aab7565cd83707277cb637634414f625f2f2e7d049f814b5f03d188347"} Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.984915 4693 generic.go:334] "Generic (PLEG): container finished" podID="0123f593-4acf-4645-83e0-dbea0580023b" containerID="edd322935d0415c9d51e340aa8938298ff3c386bedcc9bbb3d06a6caa165387b" exitCode=0 Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.984944 4693 generic.go:334] "Generic (PLEG): container finished" podID="0123f593-4acf-4645-83e0-dbea0580023b" containerID="81e87075d58bbf2789d2cd1851b2d28684e1fefe6943809ab5d065f17b259479" exitCode=2 Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.984952 4693 generic.go:334] "Generic (PLEG): container finished" podID="0123f593-4acf-4645-83e0-dbea0580023b" containerID="df5661e5faf7eae0545daf2f499c13399fc8a40b7acfb8d9fb62b68dafb0589f" exitCode=0 Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.984997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerDied","Data":"edd322935d0415c9d51e340aa8938298ff3c386bedcc9bbb3d06a6caa165387b"} Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.985043 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerDied","Data":"81e87075d58bbf2789d2cd1851b2d28684e1fefe6943809ab5d065f17b259479"} Dec 04 10:08:49 crc kubenswrapper[4693]: I1204 10:08:49.985053 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerDied","Data":"df5661e5faf7eae0545daf2f499c13399fc8a40b7acfb8d9fb62b68dafb0589f"} Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.007661 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.007643995 podStartE2EDuration="3.007643995s" podCreationTimestamp="2025-12-04 10:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:50.006808622 +0000 UTC m=+1575.904402375" watchObservedRunningTime="2025-12-04 10:08:50.007643995 +0000 UTC m=+1575.905237758" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.017959 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.151956 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pp9r5"] Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.158054 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.160697 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.160793 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.166734 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pp9r5"] Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.295408 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.295874 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.296317 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrck\" (UniqueName: \"kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.296591 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.398598 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrck\" (UniqueName: \"kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.398722 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.399660 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.399771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.408093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.408224 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.408699 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.416788 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrck\" (UniqueName: \"kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck\") pod \"nova-cell1-cell-mapping-pp9r5\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.487252 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:50 crc kubenswrapper[4693]: I1204 10:08:50.976133 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pp9r5"] Dec 04 10:08:51 crc kubenswrapper[4693]: I1204 10:08:51.075590 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:08:51 crc kubenswrapper[4693]: I1204 10:08:51.149186 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:51 crc kubenswrapper[4693]: I1204 10:08:51.149679 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="dnsmasq-dns" containerID="cri-o://386ce87ef53f449cb8ab5a574e316340291000e5bc74d3c48282243b9de714db" gracePeriod=10 Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.014534 4693 generic.go:334] "Generic (PLEG): container finished" podID="0123f593-4acf-4645-83e0-dbea0580023b" containerID="a624776e6233eb66a369870145960064fca8f704ca7eb0ff27b7a2385a22ea40" exitCode=0 Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.014703 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerDied","Data":"a624776e6233eb66a369870145960064fca8f704ca7eb0ff27b7a2385a22ea40"} Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.017914 4693 generic.go:334] "Generic (PLEG): container finished" podID="22bd0022-0cff-49e8-96c0-ef6334718552" containerID="386ce87ef53f449cb8ab5a574e316340291000e5bc74d3c48282243b9de714db" exitCode=0 Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.018036 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" event={"ID":"22bd0022-0cff-49e8-96c0-ef6334718552","Type":"ContainerDied","Data":"386ce87ef53f449cb8ab5a574e316340291000e5bc74d3c48282243b9de714db"} Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.020190 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pp9r5" event={"ID":"f223e615-c099-4f70-b613-3f438d533326","Type":"ContainerStarted","Data":"c530c346a3e1f67d5d8af849cfd15ff925733b6c421d92b1d21044479d983558"} Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.020226 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pp9r5" event={"ID":"f223e615-c099-4f70-b613-3f438d533326","Type":"ContainerStarted","Data":"2dc94492243434980426891ecc04986a1f00d15e557cc4aa31eae3ba4df501b4"} Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.049644 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pp9r5" podStartSLOduration=2.049623616 podStartE2EDuration="2.049623616s" podCreationTimestamp="2025-12-04 10:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:08:52.042729426 +0000 UTC m=+1577.940323189" watchObservedRunningTime="2025-12-04 10:08:52.049623616 +0000 UTC m=+1577.947217369" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.159151 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249134 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249246 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249305 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwws\" (UniqueName: \"kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249368 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.249505 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0\") pod \"22bd0022-0cff-49e8-96c0-ef6334718552\" (UID: \"22bd0022-0cff-49e8-96c0-ef6334718552\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.255243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws" (OuterVolumeSpecName: "kube-api-access-ggwws") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "kube-api-access-ggwws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.304019 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config" (OuterVolumeSpecName: "config") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.307357 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.308422 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.311774 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.313350 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22bd0022-0cff-49e8-96c0-ef6334718552" (UID: "22bd0022-0cff-49e8-96c0-ef6334718552"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351809 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351839 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351851 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351860 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351868 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwws\" (UniqueName: \"kubernetes.io/projected/22bd0022-0cff-49e8-96c0-ef6334718552-kube-api-access-ggwws\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.351878 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22bd0022-0cff-49e8-96c0-ef6334718552-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.410107 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555405 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555467 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555505 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555531 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555596 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555643 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555817 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555850 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.555941 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkz4b\" (UniqueName: \"kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b\") pod \"0123f593-4acf-4645-83e0-dbea0580023b\" (UID: \"0123f593-4acf-4645-83e0-dbea0580023b\") " Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.556418 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.557238 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.557291 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0123f593-4acf-4645-83e0-dbea0580023b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.560929 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b" (OuterVolumeSpecName: "kube-api-access-vkz4b") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "kube-api-access-vkz4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.564177 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts" (OuterVolumeSpecName: "scripts") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.599706 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.622712 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.630758 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.656960 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data" (OuterVolumeSpecName: "config-data") pod "0123f593-4acf-4645-83e0-dbea0580023b" (UID: "0123f593-4acf-4645-83e0-dbea0580023b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659492 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkz4b\" (UniqueName: \"kubernetes.io/projected/0123f593-4acf-4645-83e0-dbea0580023b-kube-api-access-vkz4b\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659531 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659547 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659559 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659569 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:52 crc kubenswrapper[4693]: I1204 10:08:52.659580 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0123f593-4acf-4645-83e0-dbea0580023b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.032848 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0123f593-4acf-4645-83e0-dbea0580023b","Type":"ContainerDied","Data":"289650d7e2fe85f24011b8931249579748f59c0e1332243343f9a70271025da1"} Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.033206 4693 scope.go:117] "RemoveContainer" containerID="edd322935d0415c9d51e340aa8938298ff3c386bedcc9bbb3d06a6caa165387b" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.033430 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.050619 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" event={"ID":"22bd0022-0cff-49e8-96c0-ef6334718552","Type":"ContainerDied","Data":"20d8d6b0328fee94b791f01a150ef52396dd44b8c5ba893b00152728c8acb166"} Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.050661 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858594bc89-xx8tz" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.107748 4693 scope.go:117] "RemoveContainer" containerID="81e87075d58bbf2789d2cd1851b2d28684e1fefe6943809ab5d065f17b259479" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.127273 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.135887 4693 scope.go:117] "RemoveContainer" containerID="df5661e5faf7eae0545daf2f499c13399fc8a40b7acfb8d9fb62b68dafb0589f" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.136841 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-858594bc89-xx8tz"] Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.159891 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.170115 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.173264 4693 scope.go:117] "RemoveContainer" containerID="a624776e6233eb66a369870145960064fca8f704ca7eb0ff27b7a2385a22ea40" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.184510 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.184964 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="proxy-httpd" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.184988 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="proxy-httpd" Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.185000 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="dnsmasq-dns" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185009 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="dnsmasq-dns" Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.185020 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="sg-core" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185027 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="sg-core" Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.185058 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="init" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185065 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="init" Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.185079 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-central-agent" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185087 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-central-agent" Dec 04 10:08:53 crc kubenswrapper[4693]: E1204 10:08:53.185096 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-notification-agent" Dec 04 
10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185104 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-notification-agent" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185355 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-notification-agent" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185377 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="proxy-httpd" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185391 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="sg-core" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185406 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123f593-4acf-4645-83e0-dbea0580023b" containerName="ceilometer-central-agent" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.185421 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" containerName="dnsmasq-dns" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.187211 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.190299 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.196192 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.196462 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.205532 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.210420 4693 scope.go:117] "RemoveContainer" containerID="386ce87ef53f449cb8ab5a574e316340291000e5bc74d3c48282243b9de714db" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.228533 4693 scope.go:117] "RemoveContainer" containerID="ba46fc675691f9114426b5cf58b98e7ff1de034a08c56cea4e5c1f5de6910c3e" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.272616 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.272962 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-log-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.273123 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-config-data\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: 
I1204 10:08:53.273268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-scripts\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.273535 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.273657 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-run-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.273748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.273907 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djs6\" (UniqueName: \"kubernetes.io/projected/3fcae802-3512-4246-bbe8-fc93ecb2505d-kube-api-access-4djs6\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376141 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376211 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-log-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376240 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-config-data\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376271 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-scripts\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376429 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-run-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376451 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376492 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4djs6\" (UniqueName: \"kubernetes.io/projected/3fcae802-3512-4246-bbe8-fc93ecb2505d-kube-api-access-4djs6\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.376703 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-log-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.377275 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcae802-3512-4246-bbe8-fc93ecb2505d-run-httpd\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.381769 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-scripts\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.382108 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.382139 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.382937 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-config-data\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.391443 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcae802-3512-4246-bbe8-fc93ecb2505d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " 
pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.393946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djs6\" (UniqueName: \"kubernetes.io/projected/3fcae802-3512-4246-bbe8-fc93ecb2505d-kube-api-access-4djs6\") pod \"ceilometer-0\" (UID: \"3fcae802-3512-4246-bbe8-fc93ecb2505d\") " pod="openstack/ceilometer-0" Dec 04 10:08:53 crc kubenswrapper[4693]: I1204 10:08:53.509361 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:08:54 crc kubenswrapper[4693]: W1204 10:08:54.000718 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fcae802_3512_4246_bbe8_fc93ecb2505d.slice/crio-db5183a77337596ea2d0e4a8417dce1aed30514f5e7fa8de94c6282fd10d0ea1 WatchSource:0}: Error finding container db5183a77337596ea2d0e4a8417dce1aed30514f5e7fa8de94c6282fd10d0ea1: Status 404 returned error can't find the container with id db5183a77337596ea2d0e4a8417dce1aed30514f5e7fa8de94c6282fd10d0ea1 Dec 04 10:08:54 crc kubenswrapper[4693]: I1204 10:08:54.002591 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:08:54 crc kubenswrapper[4693]: I1204 10:08:54.065734 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcae802-3512-4246-bbe8-fc93ecb2505d","Type":"ContainerStarted","Data":"db5183a77337596ea2d0e4a8417dce1aed30514f5e7fa8de94c6282fd10d0ea1"} Dec 04 10:08:54 crc kubenswrapper[4693]: I1204 10:08:54.473165 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0123f593-4acf-4645-83e0-dbea0580023b" path="/var/lib/kubelet/pods/0123f593-4acf-4645-83e0-dbea0580023b/volumes" Dec 04 10:08:54 crc kubenswrapper[4693]: I1204 10:08:54.474308 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22bd0022-0cff-49e8-96c0-ef6334718552" path="/var/lib/kubelet/pods/22bd0022-0cff-49e8-96c0-ef6334718552/volumes" Dec 04 10:08:55 crc kubenswrapper[4693]: I1204 10:08:55.078905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcae802-3512-4246-bbe8-fc93ecb2505d","Type":"ContainerStarted","Data":"92b9470736ac0ad77368c4695b7e242faf9c83612a8eeb4c0e538e4dc07ab03a"} Dec 04 10:08:56 crc kubenswrapper[4693]: I1204 10:08:56.090366 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcae802-3512-4246-bbe8-fc93ecb2505d","Type":"ContainerStarted","Data":"576a58095b13d101e172cca0f175a7931f641a146f7afb6b86621751e1749a68"} Dec 04 10:08:57 crc kubenswrapper[4693]: I1204 10:08:57.104032 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcae802-3512-4246-bbe8-fc93ecb2505d","Type":"ContainerStarted","Data":"45a2a1becaa5ad53cacd45273d2cdce8d3ea61149ed445617ff11f559e9a01e8"} Dec 04 10:08:57 crc kubenswrapper[4693]: I1204 10:08:57.106027 4693 generic.go:334] "Generic (PLEG): container finished" podID="f223e615-c099-4f70-b613-3f438d533326" containerID="c530c346a3e1f67d5d8af849cfd15ff925733b6c421d92b1d21044479d983558" exitCode=0 Dec 04 10:08:57 crc kubenswrapper[4693]: I1204 10:08:57.106069 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pp9r5" event={"ID":"f223e615-c099-4f70-b613-3f438d533326","Type":"ContainerDied","Data":"c530c346a3e1f67d5d8af849cfd15ff925733b6c421d92b1d21044479d983558"} Dec 04 10:08:58 crc kubenswrapper[4693]: 
I1204 10:08:58.378197 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.378450 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.532749 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.681694 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle\") pod \"f223e615-c099-4f70-b613-3f438d533326\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.681766 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts\") pod \"f223e615-c099-4f70-b613-3f438d533326\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.681874 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkrck\" (UniqueName: \"kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck\") pod \"f223e615-c099-4f70-b613-3f438d533326\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.682058 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data\") pod \"f223e615-c099-4f70-b613-3f438d533326\" (UID: \"f223e615-c099-4f70-b613-3f438d533326\") " Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.685259 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts" (OuterVolumeSpecName: "scripts") pod "f223e615-c099-4f70-b613-3f438d533326" (UID: "f223e615-c099-4f70-b613-3f438d533326"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.685349 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck" (OuterVolumeSpecName: "kube-api-access-wkrck") pod "f223e615-c099-4f70-b613-3f438d533326" (UID: "f223e615-c099-4f70-b613-3f438d533326"). InnerVolumeSpecName "kube-api-access-wkrck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.724537 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data" (OuterVolumeSpecName: "config-data") pod "f223e615-c099-4f70-b613-3f438d533326" (UID: "f223e615-c099-4f70-b613-3f438d533326"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.729268 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f223e615-c099-4f70-b613-3f438d533326" (UID: "f223e615-c099-4f70-b613-3f438d533326"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.784319 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.784386 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.784404 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f223e615-c099-4f70-b613-3f438d533326-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:58 crc kubenswrapper[4693]: I1204 10:08:58.784417 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkrck\" (UniqueName: \"kubernetes.io/projected/f223e615-c099-4f70-b613-3f438d533326-kube-api-access-wkrck\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.126257 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pp9r5" event={"ID":"f223e615-c099-4f70-b613-3f438d533326","Type":"ContainerDied","Data":"2dc94492243434980426891ecc04986a1f00d15e557cc4aa31eae3ba4df501b4"} Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.126308 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dc94492243434980426891ecc04986a1f00d15e557cc4aa31eae3ba4df501b4" Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.126317 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pp9r5" Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.329966 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.330259 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-log" containerID="cri-o://cde263828393acff06af27a46e3b573fbf1d695fbdde1e406ea404f77d2ea042" gracePeriod=30 Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.330377 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-api" containerID="cri-o://e82df1afb12221d8a7d8e63b9675df9bdcabd3683c5c66db219972cd63432172" gracePeriod=30 Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.343220 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": EOF" Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.343235 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": EOF" Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.347880 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.348119 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerName="nova-scheduler-scheduler" containerID="cri-o://e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" gracePeriod=30 Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.378579 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.378859 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" containerID="cri-o://95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9" gracePeriod=30 Dec 04 10:08:59 crc kubenswrapper[4693]: I1204 10:08:59.379292 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" containerID="cri-o://cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b" gracePeriod=30 Dec 04 10:08:59 crc kubenswrapper[4693]: E1204 10:08:59.641450 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:59 crc kubenswrapper[4693]: E1204 10:08:59.643278 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:59 crc kubenswrapper[4693]: E1204 10:08:59.644589 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:08:59 crc kubenswrapper[4693]: E1204 10:08:59.644634 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerName="nova-scheduler-scheduler" Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.139642 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcae802-3512-4246-bbe8-fc93ecb2505d","Type":"ContainerStarted","Data":"2f2912cb1117e5abefeb54043d861a99be376557d8c7d325e573f4ddf31c5835"} Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.139825 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.142425 4693 generic.go:334] "Generic (PLEG): container finished" podID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerID="cde263828393acff06af27a46e3b573fbf1d695fbdde1e406ea404f77d2ea042" exitCode=143 Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.142483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerDied","Data":"cde263828393acff06af27a46e3b573fbf1d695fbdde1e406ea404f77d2ea042"} Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.144609 4693 generic.go:334] "Generic (PLEG): container finished" podID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerID="95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9" exitCode=143 Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.144640 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerDied","Data":"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9"} Dec 04 10:09:00 crc kubenswrapper[4693]: I1204 10:09:00.179878 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.541601086 podStartE2EDuration="7.179846881s" podCreationTimestamp="2025-12-04 10:08:53 +0000 UTC" firstStartedPulling="2025-12-04 10:08:54.005071347 +0000 UTC m=+1579.902665100" lastFinishedPulling="2025-12-04 10:08:58.643317142 +0000 UTC m=+1584.540910895" observedRunningTime="2025-12-04 10:09:00.170220055 +0000 UTC m=+1586.067813808" watchObservedRunningTime="2025-12-04 10:09:00.179846881 +0000 UTC m=+1586.077440644" Dec 04 10:09:01 crc kubenswrapper[4693]: I1204 10:09:01.301688 4693 trace.go:236] Trace[1010677127]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/community-operators-mm9jb" (04-Dec-2025 10:09:00.207) (total time: 1094ms): Dec 04 10:09:01 crc kubenswrapper[4693]: Trace[1010677127]: [1.094437375s] [1.094437375s] END Dec 04 10:09:02 crc kubenswrapper[4693]: I1204 10:09:02.515765 4693 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:56584->10.217.0.209:8775: read: connection reset by peer" Dec 04 10:09:02 crc kubenswrapper[4693]: I1204 10:09:02.516387 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:56586->10.217.0.209:8775: read: connection reset by peer" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.024469 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.176538 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle\") pod \"c416e59a-b555-493d-8171-f6bbfc91c7a3\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.176890 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs\") pod \"c416e59a-b555-493d-8171-f6bbfc91c7a3\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.177013 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data\") pod \"c416e59a-b555-493d-8171-f6bbfc91c7a3\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.177033 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7hz\" (UniqueName: \"kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz\") pod \"c416e59a-b555-493d-8171-f6bbfc91c7a3\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.177052 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs\") pod \"c416e59a-b555-493d-8171-f6bbfc91c7a3\" (UID: \"c416e59a-b555-493d-8171-f6bbfc91c7a3\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.179438 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs" (OuterVolumeSpecName: "logs") pod "c416e59a-b555-493d-8171-f6bbfc91c7a3" (UID: "c416e59a-b555-493d-8171-f6bbfc91c7a3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.185690 4693 generic.go:334] "Generic (PLEG): container finished" podID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerID="cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b" exitCode=0 Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.185736 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerDied","Data":"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b"} Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.185767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c416e59a-b555-493d-8171-f6bbfc91c7a3","Type":"ContainerDied","Data":"264e6f4f4efa87a31bf872c2f3c21f9765ba82a8fb2b69f22ed6dd0700287c68"} Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.185789 4693 scope.go:117] "RemoveContainer" containerID="cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.185926 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.193738 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz" (OuterVolumeSpecName: "kube-api-access-nf7hz") pod "c416e59a-b555-493d-8171-f6bbfc91c7a3" (UID: "c416e59a-b555-493d-8171-f6bbfc91c7a3"). InnerVolumeSpecName "kube-api-access-nf7hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.223767 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c416e59a-b555-493d-8171-f6bbfc91c7a3" (UID: "c416e59a-b555-493d-8171-f6bbfc91c7a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.235999 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data" (OuterVolumeSpecName: "config-data") pod "c416e59a-b555-493d-8171-f6bbfc91c7a3" (UID: "c416e59a-b555-493d-8171-f6bbfc91c7a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.267743 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c416e59a-b555-493d-8171-f6bbfc91c7a3" (UID: "c416e59a-b555-493d-8171-f6bbfc91c7a3"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.279082 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.279125 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c416e59a-b555-493d-8171-f6bbfc91c7a3-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.279139 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.279151 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7hz\" (UniqueName: \"kubernetes.io/projected/c416e59a-b555-493d-8171-f6bbfc91c7a3-kube-api-access-nf7hz\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.279165 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c416e59a-b555-493d-8171-f6bbfc91c7a3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.392911 4693 scope.go:117] "RemoveContainer" containerID="95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.416152 4693 scope.go:117] "RemoveContainer" containerID="cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b" Dec 04 10:09:03 crc kubenswrapper[4693]: E1204 10:09:03.416532 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b\": container with ID starting with cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b not found: ID does not exist" containerID="cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.416583 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b"} err="failed to get container status \"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b\": rpc error: code = NotFound desc = could not find container \"cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b\": container with ID starting with cbc77ffebe84e8c9c9b5bcf9d2b075445168f6f350ebaf4cdb881652ec338c6b not found: ID does not exist" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.416605 4693 scope.go:117] "RemoveContainer" containerID="95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9" Dec 04 10:09:03 crc kubenswrapper[4693]: E1204 10:09:03.416875 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9\": container with ID starting with 95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9 not found: ID does not exist" containerID="95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.416901 4693 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9"} err="failed to get container status \"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9\": rpc error: code = NotFound desc = could not find container \"95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9\": container with ID starting with 95f4429c6fa53858daf344ce3c7e9872d146bc7800c44f64671f28050dccb9e9 not found: ID does not exist" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.532497 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.557920 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.576628 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:09:03 crc kubenswrapper[4693]: E1204 10:09:03.577037 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577055 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" Dec 04 10:09:03 crc kubenswrapper[4693]: E1204 10:09:03.577066 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577072 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" Dec 04 10:09:03 crc kubenswrapper[4693]: E1204 10:09:03.577101 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f223e615-c099-4f70-b613-3f438d533326" containerName="nova-manage" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577109 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f223e615-c099-4f70-b613-3f438d533326" containerName="nova-manage" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577295 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-log" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577316 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" containerName="nova-metadata-metadata" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.577348 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f223e615-c099-4f70-b613-3f438d533326" containerName="nova-manage" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.578462 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.582762 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.582957 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.598027 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.689161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-config-data\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.689235 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.689524 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc70aae2-116d-4528-8a5a-efab89d7e53b-logs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.689692 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dfh\" (UniqueName: \"kubernetes.io/projected/bc70aae2-116d-4528-8a5a-efab89d7e53b-kube-api-access-h2dfh\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.689820 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.764191 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.792188 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc70aae2-116d-4528-8a5a-efab89d7e53b-logs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.792266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dfh\" (UniqueName: \"kubernetes.io/projected/bc70aae2-116d-4528-8a5a-efab89d7e53b-kube-api-access-h2dfh\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.792317 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.792387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-config-data\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.792410 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.793156 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc70aae2-116d-4528-8a5a-efab89d7e53b-logs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.796743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.802068 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.818669 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc70aae2-116d-4528-8a5a-efab89d7e53b-config-data\") pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.824891 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dfh\" (UniqueName: \"kubernetes.io/projected/bc70aae2-116d-4528-8a5a-efab89d7e53b-kube-api-access-h2dfh\") 
pod \"nova-metadata-0\" (UID: \"bc70aae2-116d-4528-8a5a-efab89d7e53b\") " pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.894012 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8tkn\" (UniqueName: \"kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn\") pod \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.894170 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data\") pod \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.894377 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle\") pod \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\" (UID: \"0f92e892-caed-4dd8-ae71-62e8d5be9d56\") " Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.897797 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn" (OuterVolumeSpecName: "kube-api-access-r8tkn") pod "0f92e892-caed-4dd8-ae71-62e8d5be9d56" (UID: "0f92e892-caed-4dd8-ae71-62e8d5be9d56"). InnerVolumeSpecName "kube-api-access-r8tkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.928163 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f92e892-caed-4dd8-ae71-62e8d5be9d56" (UID: "0f92e892-caed-4dd8-ae71-62e8d5be9d56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.937397 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data" (OuterVolumeSpecName: "config-data") pod "0f92e892-caed-4dd8-ae71-62e8d5be9d56" (UID: "0f92e892-caed-4dd8-ae71-62e8d5be9d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.969321 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.997867 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.997898 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f92e892-caed-4dd8-ae71-62e8d5be9d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:03 crc kubenswrapper[4693]: I1204 10:09:03.997911 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8tkn\" (UniqueName: \"kubernetes.io/projected/0f92e892-caed-4dd8-ae71-62e8d5be9d56-kube-api-access-r8tkn\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.203534 4693 generic.go:334] "Generic (PLEG): container finished" podID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" exitCode=0 Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.203945 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f92e892-caed-4dd8-ae71-62e8d5be9d56","Type":"ContainerDied","Data":"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f"} Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.203988 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f92e892-caed-4dd8-ae71-62e8d5be9d56","Type":"ContainerDied","Data":"f595e41308fa5c8bf4fce8699c854362991ad7bfa14a5185abf4ba28a4585c10"} Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.204006 4693 scope.go:117] "RemoveContainer" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.204013 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.227960 4693 scope.go:117] "RemoveContainer" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" Dec 04 10:09:04 crc kubenswrapper[4693]: E1204 10:09:04.228375 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f\": container with ID starting with e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f not found: ID does not exist" containerID="e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.228407 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f"} err="failed to get container status \"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f\": rpc error: code = NotFound desc = could not find container \"e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f\": container with ID starting with e6995bf5d59c84b0b9bbd35f223a9113981613da0f6f10e07e7223fa08a9796f not found: ID does not exist" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.262795 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.279510 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.294102 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:09:04 crc kubenswrapper[4693]: E1204 10:09:04.294797 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerName="nova-scheduler-scheduler" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.294818 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerName="nova-scheduler-scheduler" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.295091 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" containerName="nova-scheduler-scheduler" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.295893 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.297443 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.313144 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.408165 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.408261 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-config-data\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.408318 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqw9\" (UniqueName: \"kubernetes.io/projected/ca70cccf-b92e-4997-9ca9-1375a2cceca1-kube-api-access-mqqw9\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: W1204 10:09:04.446707 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc70aae2_116d_4528_8a5a_efab89d7e53b.slice/crio-d3fc6d2bc5045fd91b2d245a805a9504714762a5b038a34a25ab37552b042b5a WatchSource:0}: Error finding container d3fc6d2bc5045fd91b2d245a805a9504714762a5b038a34a25ab37552b042b5a: Status 404 returned error can't find the container with id d3fc6d2bc5045fd91b2d245a805a9504714762a5b038a34a25ab37552b042b5a Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.457968 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.471732 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f92e892-caed-4dd8-ae71-62e8d5be9d56" path="/var/lib/kubelet/pods/0f92e892-caed-4dd8-ae71-62e8d5be9d56/volumes" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.472317 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c416e59a-b555-493d-8171-f6bbfc91c7a3" path="/var/lib/kubelet/pods/c416e59a-b555-493d-8171-f6bbfc91c7a3/volumes" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.510059 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.510140 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-config-data\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.510185 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mqqw9\" (UniqueName: \"kubernetes.io/projected/ca70cccf-b92e-4997-9ca9-1375a2cceca1-kube-api-access-mqqw9\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.516190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-config-data\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.516209 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca70cccf-b92e-4997-9ca9-1375a2cceca1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.526493 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqw9\" (UniqueName: \"kubernetes.io/projected/ca70cccf-b92e-4997-9ca9-1375a2cceca1-kube-api-access-mqqw9\") pod \"nova-scheduler-0\" (UID: \"ca70cccf-b92e-4997-9ca9-1375a2cceca1\") " pod="openstack/nova-scheduler-0" Dec 04 10:09:04 crc kubenswrapper[4693]: I1204 10:09:04.621112 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:09:05 crc kubenswrapper[4693]: I1204 10:09:05.049286 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:09:05 crc kubenswrapper[4693]: I1204 10:09:05.213573 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca70cccf-b92e-4997-9ca9-1375a2cceca1","Type":"ContainerStarted","Data":"0ec981583778c5326383213a139994aaa9cd8fa7870789e07dc0733ee3a1325e"} Dec 04 10:09:05 crc kubenswrapper[4693]: I1204 10:09:05.215281 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc70aae2-116d-4528-8a5a-efab89d7e53b","Type":"ContainerStarted","Data":"e1ddec4aa255e5afd9eea63a8bd819c01f56a85a73dfaa2b38ae2697baa61ddb"} Dec 04 10:09:05 crc kubenswrapper[4693]: I1204 10:09:05.215396 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc70aae2-116d-4528-8a5a-efab89d7e53b","Type":"ContainerStarted","Data":"d3fc6d2bc5045fd91b2d245a805a9504714762a5b038a34a25ab37552b042b5a"} Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.229064 4693 generic.go:334] "Generic (PLEG): container finished" podID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerID="e82df1afb12221d8a7d8e63b9675df9bdcabd3683c5c66db219972cd63432172" exitCode=0 Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.229144 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerDied","Data":"e82df1afb12221d8a7d8e63b9675df9bdcabd3683c5c66db219972cd63432172"} Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.233439 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca70cccf-b92e-4997-9ca9-1375a2cceca1","Type":"ContainerStarted","Data":"8b57355f76cdaf1c8cd4f3236c6ce44e2bce0c3232a5fd4cdc5a81fcdc68341c"} Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.237047 4693 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc70aae2-116d-4528-8a5a-efab89d7e53b","Type":"ContainerStarted","Data":"789726d33ab84a34faddd3e25f518e3be7a5a67634d38e0106f8108349fb8d69"} Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.269426 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2694016 podStartE2EDuration="2.2694016s" podCreationTimestamp="2025-12-04 10:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:09:06.258724485 +0000 UTC m=+1592.156318238" watchObservedRunningTime="2025-12-04 10:09:06.2694016 +0000 UTC m=+1592.166995353" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.280029 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.280012914 podStartE2EDuration="3.280012914s" podCreationTimestamp="2025-12-04 10:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:09:06.277226727 +0000 UTC m=+1592.174820480" watchObservedRunningTime="2025-12-04 10:09:06.280012914 +0000 UTC m=+1592.177606667" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.502413 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657040 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd62v\" (UniqueName: \"kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657137 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657168 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657238 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657293 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" (UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.657368 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs\") pod \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\" 
(UID: \"53f4ea25-b785-4d6d-a740-7f0bf798fba0\") " Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.658002 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs" (OuterVolumeSpecName: "logs") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.663518 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v" (OuterVolumeSpecName: "kube-api-access-dd62v") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "kube-api-access-dd62v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.685093 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data" (OuterVolumeSpecName: "config-data") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.698240 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.713494 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.719915 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53f4ea25-b785-4d6d-a740-7f0bf798fba0" (UID: "53f4ea25-b785-4d6d-a740-7f0bf798fba0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.759929 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.759978 4693 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.759992 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd62v\" (UniqueName: \"kubernetes.io/projected/53f4ea25-b785-4d6d-a740-7f0bf798fba0-kube-api-access-dd62v\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.760006 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53f4ea25-b785-4d6d-a740-7f0bf798fba0-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.760014 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:06 crc kubenswrapper[4693]: I1204 10:09:06.760024 4693 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53f4ea25-b785-4d6d-a740-7f0bf798fba0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.256692 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.261504 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53f4ea25-b785-4d6d-a740-7f0bf798fba0","Type":"ContainerDied","Data":"a61ce5aab7565cd83707277cb637634414f625f2f2e7d049f814b5f03d188347"} Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.261595 4693 scope.go:117] "RemoveContainer" containerID="e82df1afb12221d8a7d8e63b9675df9bdcabd3683c5c66db219972cd63432172" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.352642 4693 scope.go:117] "RemoveContainer" containerID="cde263828393acff06af27a46e3b573fbf1d695fbdde1e406ea404f77d2ea042" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.368421 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.379926 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.388913 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:09:07 crc kubenswrapper[4693]: E1204 10:09:07.389534 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-api" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.389552 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-api" Dec 04 10:09:07 crc kubenswrapper[4693]: E1204 10:09:07.389592 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-log" Dec 04 10:09:07 crc kubenswrapper[4693]: 
I1204 10:09:07.389599 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-log" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.389871 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-log" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.389907 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" containerName="nova-api-api" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.391315 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.402780 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.402913 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.410786 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.413118 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474190 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-public-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474382 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-config-data\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474417 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwc75\" (UniqueName: \"kubernetes.io/projected/d658460b-438a-46f0-88e1-136741999c81-kube-api-access-rwc75\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474495 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658460b-438a-46f0-88e1-136741999c81-logs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.474525 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577160 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577250 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-public-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577295 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-config-data\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577319 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwc75\" (UniqueName: \"kubernetes.io/projected/d658460b-438a-46f0-88e1-136741999c81-kube-api-access-rwc75\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577383 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658460b-438a-46f0-88e1-136741999c81-logs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.577403 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.578734 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658460b-438a-46f0-88e1-136741999c81-logs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.582772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.588488 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-public-tls-certs\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.588599 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " 
pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.588997 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658460b-438a-46f0-88e1-136741999c81-config-data\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.597757 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwc75\" (UniqueName: \"kubernetes.io/projected/d658460b-438a-46f0-88e1-136741999c81-kube-api-access-rwc75\") pod \"nova-api-0\" (UID: \"d658460b-438a-46f0-88e1-136741999c81\") " pod="openstack/nova-api-0" Dec 04 10:09:07 crc kubenswrapper[4693]: I1204 10:09:07.715456 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:09:08 crc kubenswrapper[4693]: W1204 10:09:08.182263 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd658460b_438a_46f0_88e1_136741999c81.slice/crio-8b606ee9eae9cd1bf172637c834849ba246cc6521e2609c04f41e1ba6fd175f5 WatchSource:0}: Error finding container 8b606ee9eae9cd1bf172637c834849ba246cc6521e2609c04f41e1ba6fd175f5: Status 404 returned error can't find the container with id 8b606ee9eae9cd1bf172637c834849ba246cc6521e2609c04f41e1ba6fd175f5 Dec 04 10:09:08 crc kubenswrapper[4693]: I1204 10:09:08.191405 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:09:08 crc kubenswrapper[4693]: I1204 10:09:08.270595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d658460b-438a-46f0-88e1-136741999c81","Type":"ContainerStarted","Data":"8b606ee9eae9cd1bf172637c834849ba246cc6521e2609c04f41e1ba6fd175f5"} Dec 04 10:09:08 crc kubenswrapper[4693]: I1204 10:09:08.476589 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f4ea25-b785-4d6d-a740-7f0bf798fba0" path="/var/lib/kubelet/pods/53f4ea25-b785-4d6d-a740-7f0bf798fba0/volumes" Dec 04 10:09:08 crc kubenswrapper[4693]: I1204 10:09:08.971298 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:09:08 crc kubenswrapper[4693]: I1204 10:09:08.971653 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:09:09 crc kubenswrapper[4693]: I1204 10:09:09.281550 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d658460b-438a-46f0-88e1-136741999c81","Type":"ContainerStarted","Data":"ad09270ddd73b8bd524dc3c5d865d6a74a8e53ddaf62730273ad26273695177b"} Dec 04 10:09:09 crc kubenswrapper[4693]: I1204 10:09:09.281594 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d658460b-438a-46f0-88e1-136741999c81","Type":"ContainerStarted","Data":"4dbe0020dd3a86bb936d3d34d8e191e4ab765b8b429abcf98d39eea4bef4d912"} Dec 04 10:09:09 crc kubenswrapper[4693]: I1204 10:09:09.309356 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.309315491 podStartE2EDuration="2.309315491s" podCreationTimestamp="2025-12-04 10:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:09:09.302058431 +0000 UTC m=+1595.199652184" watchObservedRunningTime="2025-12-04 
10:09:09.309315491 +0000 UTC m=+1595.206909244" Dec 04 10:09:09 crc kubenswrapper[4693]: I1204 10:09:09.622404 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:09:13 crc kubenswrapper[4693]: I1204 10:09:13.970516 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:09:13 crc kubenswrapper[4693]: I1204 10:09:13.971154 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:09:14 crc kubenswrapper[4693]: I1204 10:09:14.622636 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:09:14 crc kubenswrapper[4693]: I1204 10:09:14.665429 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:09:14 crc kubenswrapper[4693]: I1204 10:09:14.985586 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc70aae2-116d-4528-8a5a-efab89d7e53b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:09:14 crc kubenswrapper[4693]: I1204 10:09:14.985636 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc70aae2-116d-4528-8a5a-efab89d7e53b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:09:15 crc kubenswrapper[4693]: I1204 10:09:15.376278 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:09:17 crc kubenswrapper[4693]: I1204 10:09:17.716392 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:09:17 crc kubenswrapper[4693]: I1204 10:09:17.717560 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:09:18 crc kubenswrapper[4693]: I1204 10:09:18.728496 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d658460b-438a-46f0-88e1-136741999c81" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:09:18 crc kubenswrapper[4693]: I1204 10:09:18.729097 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d658460b-438a-46f0-88e1-136741999c81" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:09:23 crc kubenswrapper[4693]: I1204 10:09:23.528207 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:09:23 crc kubenswrapper[4693]: I1204 10:09:23.980515 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:09:23 crc kubenswrapper[4693]: I1204 10:09:23.981730 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:09:23 crc kubenswrapper[4693]: I1204 10:09:23.989246 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 
10:09:24 crc kubenswrapper[4693]: I1204 10:09:24.478212 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:09:27 crc kubenswrapper[4693]: I1204 10:09:27.726647 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:09:27 crc kubenswrapper[4693]: I1204 10:09:27.727789 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:09:27 crc kubenswrapper[4693]: I1204 10:09:27.735308 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:09:27 crc kubenswrapper[4693]: I1204 10:09:27.736473 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:09:28 crc kubenswrapper[4693]: I1204 10:09:28.522050 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:09:28 crc kubenswrapper[4693]: I1204 10:09:28.528160 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:09:37 crc kubenswrapper[4693]: I1204 10:09:37.185823 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:38 crc kubenswrapper[4693]: I1204 10:09:38.315208 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:42 crc kubenswrapper[4693]: I1204 10:09:42.117986 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="rabbitmq" containerID="cri-o://390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b" gracePeriod=604796 Dec 04 10:09:42 crc kubenswrapper[4693]: I1204 10:09:42.859503 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="rabbitmq" containerID="cri-o://0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83" gracePeriod=604796 Dec 04 10:09:48 crc kubenswrapper[4693]: I1204 10:09:48.697849 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.310976 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.518001 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.525011 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.668850 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669268 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669314 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669354 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669405 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669424 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6pn\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669448 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669481 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669526 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669583 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data\") pod 
\"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669623 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669668 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669697 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669716 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669746 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669765 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669822 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khw8b\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669853 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669909 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669929 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf\") pod \"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\" (UID: 
\"2d1a11f6-b003-41f8-a2f1-010d7dae29d4\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669948 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.669981 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.670663 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.676946 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.683984 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.685798 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.689440 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.690090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.691320 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.691649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.691762 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info" (OuterVolumeSpecName: "pod-info") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.693659 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.693758 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.693931 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.694009 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.694213 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b" (OuterVolumeSpecName: "kube-api-access-khw8b") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "kube-api-access-khw8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.694398 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.695851 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn" (OuterVolumeSpecName: "kube-api-access-6p6pn") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "kube-api-access-6p6pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.719412 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data" (OuterVolumeSpecName: "config-data") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.771155 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data" (OuterVolumeSpecName: "config-data") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.772545 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf" (OuterVolumeSpecName: "server-conf") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.774891 4693 generic.go:334] "Generic (PLEG): container finished" podID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerID="390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b" exitCode=0 Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.775057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerDied","Data":"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b"} Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.775165 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2d1a11f6-b003-41f8-a2f1-010d7dae29d4","Type":"ContainerDied","Data":"f555143c410f65fd65e90fe83fee572868963d76e090a838126f879bd1cbe4b9"} Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.775242 4693 scope.go:117] "RemoveContainer" containerID="390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.775669 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785027 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") pod \"0f073022-a55b-4a76-8fbd-92df61f2d38b\" (UID: \"0f073022-a55b-4a76-8fbd-92df61f2d38b\") " Dec 04 10:09:49 crc kubenswrapper[4693]: W1204 10:09:49.785449 4693 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0f073022-a55b-4a76-8fbd-92df61f2d38b/volumes/kubernetes.io~configmap/server-conf Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785536 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf" (OuterVolumeSpecName: "server-conf") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785612 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785642 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785658 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785667 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785680 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785690 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khw8b\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-kube-api-access-khw8b\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785699 4693 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785708 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785718 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 
10:09:49.785727 4693 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785735 4693 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f073022-a55b-4a76-8fbd-92df61f2d38b-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785742 4693 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785752 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785760 4693 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f073022-a55b-4a76-8fbd-92df61f2d38b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785768 4693 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785777 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p6pn\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-kube-api-access-6p6pn\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785785 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785794 4693 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.785804 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f073022-a55b-4a76-8fbd-92df61f2d38b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.813398 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.850616 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.862612 4693 scope.go:117] "RemoveContainer" containerID="1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.863040 4693 generic.go:334] "Generic (PLEG): container finished" podID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerID="0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83" exitCode=0 Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.863155 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerDied","Data":"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83"} Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.863240 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f073022-a55b-4a76-8fbd-92df61f2d38b","Type":"ContainerDied","Data":"b9dbe041a9a892af100aa80824488278db9f03cd649ac04fb30c2c5009097bc4"} Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.863423 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.898491 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.898637 4693 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.924219 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.941784 4693 scope.go:117] "RemoveContainer" containerID="390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b" Dec 04 10:09:49 crc kubenswrapper[4693]: E1204 10:09:49.958504 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b\": container with ID starting with 390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b not found: ID does not exist" containerID="390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.958554 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b"} err="failed to get container status \"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b\": rpc error: code = NotFound desc = could not find container \"390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b\": container with ID starting with 390c00429d227816954ec973d7ff10025ee0a9bfa2b911619ec912362f25606b not found: ID does not exist" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 
10:09:49.958581 4693 scope.go:117] "RemoveContainer" containerID="1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77" Dec 04 10:09:49 crc kubenswrapper[4693]: E1204 10:09:49.959214 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77\": container with ID starting with 1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77 not found: ID does not exist" containerID="1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.959288 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77"} err="failed to get container status \"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77\": rpc error: code = NotFound desc = could not find container \"1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77\": container with ID starting with 1b1cc8585e13a887662f01305afa00ea83a4300fcc52b4fcfcf4fc153d0a7e77 not found: ID does not exist" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.959324 4693 scope.go:117] "RemoveContainer" containerID="0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.987962 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0f073022-a55b-4a76-8fbd-92df61f2d38b" (UID: "0f073022-a55b-4a76-8fbd-92df61f2d38b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:49 crc kubenswrapper[4693]: I1204 10:09:49.992360 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d1a11f6-b003-41f8-a2f1-010d7dae29d4" (UID: "2d1a11f6-b003-41f8-a2f1-010d7dae29d4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.039469 4693 scope.go:117] "RemoveContainer" containerID="c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.043859 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f073022-a55b-4a76-8fbd-92df61f2d38b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.043890 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.043902 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d1a11f6-b003-41f8-a2f1-010d7dae29d4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.080778 4693 scope.go:117] "RemoveContainer" containerID="0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83" Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.081712 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83\": container with ID starting with 0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83 not found: ID does not exist" containerID="0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.081767 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83"} err="failed to get container status \"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83\": rpc error: code = NotFound desc = could not find container \"0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83\": container with ID starting with 0306fbcb7588457827f7161802a65836c7ffad8bc8a3fe27d04d7d4c853f8b83 not found: ID does not exist" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.081800 4693 scope.go:117] "RemoveContainer" containerID="c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995" Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.089539 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995\": container with ID starting with c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995 not found: ID does not exist" containerID="c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.089676 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995"} err="failed to get container status \"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995\": rpc error: code = NotFound desc = could not find container \"c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995\": container with ID starting with c109a18000a463f941965facc7321c77423991e9abc308252989053addd9f995 not found: ID does not exist" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.122960 
4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.141403 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.149073 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.149616 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.149635 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.149685 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.149693 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.149719 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="setup-container" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.149727 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="setup-container" Dec 04 10:09:50 crc kubenswrapper[4693]: E1204 10:09:50.149765 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="setup-container" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.149774 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="setup-container" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.150175 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.150498 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" containerName="rabbitmq" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.152213 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157768 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157931 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157958 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157985 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157929 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.157943 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.158171 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jzl54" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.192419 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.291678 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.300642 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.327364 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.329387 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333455 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333516 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333526 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333603 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r5rkb" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333616 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.333795 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.334551 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.343225 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350550 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350640 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a72230-d599-4df6-bd4b-279092bf8861-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350688 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350716 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350740 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350770 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a72230-d599-4df6-bd4b-279092bf8861-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350909 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350948 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.350982 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gk2\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-kube-api-access-g8gk2\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453052 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ee328e-d29f-4224-913b-bc23195bf2b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453106 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453135 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453158 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453235 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a72230-d599-4df6-bd4b-279092bf8861-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453488 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453507 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453634 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453678 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-kube-api-access-sj5k8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.453991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " 
pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454081 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454127 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454182 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a72230-d599-4df6-bd4b-279092bf8861-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454229 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454295 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454366 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454383 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454401 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ee328e-d29f-4224-913b-bc23195bf2b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 
10:09:50.454453 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454469 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8gk2\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-kube-api-access-g8gk2\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.454651 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.455199 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.455742 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-config-data\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.456800 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c0a72230-d599-4df6-bd4b-279092bf8861-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.459432 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c0a72230-d599-4df6-bd4b-279092bf8861-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.459502 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c0a72230-d599-4df6-bd4b-279092bf8861-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.459744 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.471205 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " 
pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.476606 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f073022-a55b-4a76-8fbd-92df61f2d38b" path="/var/lib/kubelet/pods/0f073022-a55b-4a76-8fbd-92df61f2d38b/volumes" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.477818 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1a11f6-b003-41f8-a2f1-010d7dae29d4" path="/var/lib/kubelet/pods/2d1a11f6-b003-41f8-a2f1-010d7dae29d4/volumes" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.478224 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8gk2\" (UniqueName: \"kubernetes.io/projected/c0a72230-d599-4df6-bd4b-279092bf8861-kube-api-access-g8gk2\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.515551 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c0a72230-d599-4df6-bd4b-279092bf8861\") " pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.555952 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556045 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ee328e-d29f-4224-913b-bc23195bf2b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556080 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556689 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ee328e-d29f-4224-913b-bc23195bf2b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556718 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc 
kubenswrapper[4693]: I1204 10:09:50.556748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556798 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556844 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-kube-api-access-sj5k8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556911 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.556938 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.557003 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.557898 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.558219 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.558355 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.558700 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.559206 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ee328e-d29f-4224-913b-bc23195bf2b2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.560690 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.573417 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.573718 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ee328e-d29f-4224-913b-bc23195bf2b2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.574102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ee328e-d29f-4224-913b-bc23195bf2b2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.578145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/c1ee328e-d29f-4224-913b-bc23195bf2b2-kube-api-access-sj5k8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.578730 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.604795 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1ee328e-d29f-4224-913b-bc23195bf2b2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:50 crc kubenswrapper[4693]: I1204 10:09:50.649715 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:09:51 crc kubenswrapper[4693]: I1204 10:09:51.078659 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:09:51 crc kubenswrapper[4693]: I1204 10:09:51.255359 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:09:51 crc kubenswrapper[4693]: W1204 10:09:51.262523 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ee328e_d29f_4224_913b_bc23195bf2b2.slice/crio-dde644310f21446ebaaf3a674eb3a8a18922e31c58838419e8a8d27e3b63310a WatchSource:0}: Error finding container dde644310f21446ebaaf3a674eb3a8a18922e31c58838419e8a8d27e3b63310a: Status 404 returned error can't find the container with id dde644310f21446ebaaf3a674eb3a8a18922e31c58838419e8a8d27e3b63310a Dec 04 10:09:51 crc kubenswrapper[4693]: I1204 10:09:51.888847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1ee328e-d29f-4224-913b-bc23195bf2b2","Type":"ContainerStarted","Data":"dde644310f21446ebaaf3a674eb3a8a18922e31c58838419e8a8d27e3b63310a"} Dec 04 10:09:51 crc kubenswrapper[4693]: I1204 10:09:51.890162 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0a72230-d599-4df6-bd4b-279092bf8861","Type":"ContainerStarted","Data":"cae4bc1f27b127bcee03b24dd68a7fdb4af336d9d4558ff98cdfa9bc9be53d82"} Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.273405 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.273471 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.452979 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.483154 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.486074 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.591222 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598476 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598763 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598822 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598889 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhww\" (UniqueName: \"kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598918 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598970 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.598998 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700305 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhww\" (UniqueName: \"kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww\") pod 
\"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700391 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700468 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700500 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700550 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700566 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.700624 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.701570 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.701840 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.701924 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " 
pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.702122 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.702457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.702685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.761259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhww\" (UniqueName: \"kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww\") pod \"dnsmasq-dns-696df444c7-czvwj\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.836680 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:09:52 crc kubenswrapper[4693]: I1204 10:09:52.926934 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0a72230-d599-4df6-bd4b-279092bf8861","Type":"ContainerStarted","Data":"77a389618637a23a79e2c4c6722400fac12de64d8735f08fe4eb506e62e5cbe0"} Dec 04 10:09:53 crc kubenswrapper[4693]: I1204 10:09:53.353896 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:09:53 crc kubenswrapper[4693]: I1204 10:09:53.937415 4693 generic.go:334] "Generic (PLEG): container finished" podID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerID="926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235" exitCode=0 Dec 04 10:09:53 crc kubenswrapper[4693]: I1204 10:09:53.937514 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-czvwj" event={"ID":"987275a8-7f64-4c33-a5a6-d07ada638b6f","Type":"ContainerDied","Data":"926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235"} Dec 04 10:09:53 crc kubenswrapper[4693]: I1204 10:09:53.938030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-czvwj" event={"ID":"987275a8-7f64-4c33-a5a6-d07ada638b6f","Type":"ContainerStarted","Data":"cf0a30925055a288bf87cdb7d261a222415e4eb577bb5924ad10e966bed9f1af"} Dec 04 10:09:53 crc kubenswrapper[4693]: I1204 10:09:53.939949 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1ee328e-d29f-4224-913b-bc23195bf2b2","Type":"ContainerStarted","Data":"adb04669f3faec3a20decfc6aef31e7b3df15f420269b34e4f26d26e42790b1c"} Dec 04 10:09:54 crc kubenswrapper[4693]: I1204 10:09:54.952601 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-696df444c7-czvwj" event={"ID":"987275a8-7f64-4c33-a5a6-d07ada638b6f","Type":"ContainerStarted","Data":"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c"} Dec 04 10:09:54 crc kubenswrapper[4693]: I1204 10:09:54.972986 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-696df444c7-czvwj" podStartSLOduration=2.972965301 podStartE2EDuration="2.972965301s" podCreationTimestamp="2025-12-04 10:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:09:54.971314387 +0000 UTC m=+1640.868908140" watchObservedRunningTime="2025-12-04 10:09:54.972965301 +0000 UTC m=+1640.870559054" Dec 04 10:09:55 crc kubenswrapper[4693]: I1204 10:09:55.959978 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:10:02 crc kubenswrapper[4693]: I1204 10:10:02.839251 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:10:02 crc kubenswrapper[4693]: I1204 10:10:02.911450 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:10:02 crc kubenswrapper[4693]: I1204 10:10:02.911745 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="dnsmasq-dns" containerID="cri-o://0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed" gracePeriod=10 Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.107962 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-krd95"] Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.110710 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.138991 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-krd95"] Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244162 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244253 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2q6\" (UniqueName: \"kubernetes.io/projected/60e50add-23a4-48de-a35c-0275bab951b1-kube-api-access-rm2q6\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244401 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-config\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244492 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.244518 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346732 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2q6\" (UniqueName: \"kubernetes.io/projected/60e50add-23a4-48de-a35c-0275bab951b1-kube-api-access-rm2q6\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 
10:10:03.346789 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-config\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346852 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346874 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346898 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.346957 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.348169 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-openstack-edpm-ipam\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.351023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-svc\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.351942 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-config\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.352535 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-dns-swift-storage-0\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.353037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-sb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.353752 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60e50add-23a4-48de-a35c-0275bab951b1-ovsdbserver-nb\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.388663 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2q6\" (UniqueName: \"kubernetes.io/projected/60e50add-23a4-48de-a35c-0275bab951b1-kube-api-access-rm2q6\") pod \"dnsmasq-dns-84b5f8b59f-krd95\" (UID: \"60e50add-23a4-48de-a35c-0275bab951b1\") " pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.453449 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.601746 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657155 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc654\" (UniqueName: \"kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657219 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657347 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657420 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657506 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: 
\"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.657520 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config\") pod \"fdebc460-1495-4d6e-9621-3117453dd084\" (UID: \"fdebc460-1495-4d6e-9621-3117453dd084\") " Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.664547 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654" (OuterVolumeSpecName: "kube-api-access-dc654") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "kube-api-access-dc654". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.713389 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.722474 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.722592 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.729191 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.729504 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config" (OuterVolumeSpecName: "config") pod "fdebc460-1495-4d6e-9621-3117453dd084" (UID: "fdebc460-1495-4d6e-9621-3117453dd084"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766466 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766558 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766571 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc654\" (UniqueName: \"kubernetes.io/projected/fdebc460-1495-4d6e-9621-3117453dd084-kube-api-access-dc654\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766603 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766627 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.766636 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdebc460-1495-4d6e-9621-3117453dd084-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:03 crc kubenswrapper[4693]: I1204 10:10:03.914056 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b5f8b59f-krd95"] Dec 04 10:10:03 crc kubenswrapper[4693]: W1204 10:10:03.921047 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e50add_23a4_48de_a35c_0275bab951b1.slice/crio-30dc5644f7e26e992cfa65ddbd1c1f4978999826975430b826bf7b6d870f7eb2 WatchSource:0}: Error finding container 30dc5644f7e26e992cfa65ddbd1c1f4978999826975430b826bf7b6d870f7eb2: Status 404 returned error can't find the container with id 30dc5644f7e26e992cfa65ddbd1c1f4978999826975430b826bf7b6d870f7eb2 Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.043158 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" event={"ID":"60e50add-23a4-48de-a35c-0275bab951b1","Type":"ContainerStarted","Data":"30dc5644f7e26e992cfa65ddbd1c1f4978999826975430b826bf7b6d870f7eb2"} Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.045160 4693 generic.go:334] "Generic (PLEG): container finished" podID="fdebc460-1495-4d6e-9621-3117453dd084" containerID="0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed" exitCode=0 Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.045206 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.045213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" event={"ID":"fdebc460-1495-4d6e-9621-3117453dd084","Type":"ContainerDied","Data":"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed"} Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.045246 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958d5dc75-r2lpz" event={"ID":"fdebc460-1495-4d6e-9621-3117453dd084","Type":"ContainerDied","Data":"f26b82d567036128005ed2d0bc3a7a32bc80a44e2b077bc007d2ebdf1bf6d7b6"} Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.045263 4693 scope.go:117] "RemoveContainer" containerID="0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.112074 4693 scope.go:117] "RemoveContainer" containerID="a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.139593 4693 scope.go:117] "RemoveContainer" containerID="0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed" Dec 04 10:10:04 crc kubenswrapper[4693]: E1204 10:10:04.140028 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed\": container with ID starting with 0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed not found: ID does not exist" containerID="0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.140068 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed"} err="failed to get container status \"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed\": rpc error: code = NotFound desc = could not find container \"0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed\": container with ID starting with 0725817d2f68e84f8870049db43ea3f1ccf1b2d1469e70957d39aba6fe1154ed not found: ID does not exist" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.140094 4693 scope.go:117] "RemoveContainer" containerID="a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace" Dec 04 10:10:04 crc kubenswrapper[4693]: E1204 10:10:04.140424 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace\": container with ID starting with a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace not found: ID does not exist" containerID="a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace" Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.140454 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace"} err="failed to get container status \"a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace\": rpc error: code = NotFound desc = could not find container \"a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace\": container with ID starting with a9790b8d5d495a4a88d88c13709e4fc136ed5c87047acd9bd461d69d74684ace not found: ID does not exist" Dec 04 10:10:04 crc 
kubenswrapper[4693]: I1204 10:10:04.140833 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.150797 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5958d5dc75-r2lpz"] Dec 04 10:10:04 crc kubenswrapper[4693]: I1204 10:10:04.471876 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdebc460-1495-4d6e-9621-3117453dd084" path="/var/lib/kubelet/pods/fdebc460-1495-4d6e-9621-3117453dd084/volumes" Dec 04 10:10:05 crc kubenswrapper[4693]: I1204 10:10:05.057735 4693 generic.go:334] "Generic (PLEG): container finished" podID="60e50add-23a4-48de-a35c-0275bab951b1" containerID="8169af0a3ee837dfbf8b1b801cd8d7ce424fdffe78886f68b66e763f96d8b4b1" exitCode=0 Dec 04 10:10:05 crc kubenswrapper[4693]: I1204 10:10:05.057812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" event={"ID":"60e50add-23a4-48de-a35c-0275bab951b1","Type":"ContainerDied","Data":"8169af0a3ee837dfbf8b1b801cd8d7ce424fdffe78886f68b66e763f96d8b4b1"} Dec 04 10:10:06 crc kubenswrapper[4693]: I1204 10:10:06.090586 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" event={"ID":"60e50add-23a4-48de-a35c-0275bab951b1","Type":"ContainerStarted","Data":"aa0f25216d92ddca6879ea75d374db56d0f4912ec30ca59b51c49c13ba67d6ee"} Dec 04 10:10:06 crc kubenswrapper[4693]: I1204 10:10:06.091053 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:06 crc kubenswrapper[4693]: I1204 10:10:06.118950 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" podStartSLOduration=3.118921983 podStartE2EDuration="3.118921983s" podCreationTimestamp="2025-12-04 10:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:10:06.113379058 +0000 UTC m=+1652.010972821" watchObservedRunningTime="2025-12-04 10:10:06.118921983 +0000 UTC m=+1652.016515736" Dec 04 10:10:07 crc kubenswrapper[4693]: I1204 10:10:07.518551 4693 scope.go:117] "RemoveContainer" containerID="fad32760f70ee8315597994a591cd775d292bf8360d7a15ed9ede88765fdf706" Dec 04 10:10:13 crc kubenswrapper[4693]: I1204 10:10:13.455398 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b5f8b59f-krd95" Dec 04 10:10:13 crc kubenswrapper[4693]: I1204 10:10:13.518258 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:10:13 crc kubenswrapper[4693]: I1204 10:10:13.518572 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-696df444c7-czvwj" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="dnsmasq-dns" containerID="cri-o://a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c" gracePeriod=10 Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.020741 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090658 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090719 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090784 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090839 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkhww\" (UniqueName: \"kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090900 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.090975 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.091008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0\") pod \"987275a8-7f64-4c33-a5a6-d07ada638b6f\" (UID: \"987275a8-7f64-4c33-a5a6-d07ada638b6f\") " Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.100599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww" (OuterVolumeSpecName: "kube-api-access-bkhww") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "kube-api-access-bkhww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.143894 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config" (OuterVolumeSpecName: "config") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.146986 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.150995 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.167973 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.168142 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172058 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "987275a8-7f64-4c33-a5a6-d07ada638b6f" (UID: "987275a8-7f64-4c33-a5a6-d07ada638b6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172445 4693 generic.go:334] "Generic (PLEG): container finished" podID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerID="a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c" exitCode=0 Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172510 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-696df444c7-czvwj" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172530 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-czvwj" event={"ID":"987275a8-7f64-4c33-a5a6-d07ada638b6f","Type":"ContainerDied","Data":"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c"} Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172758 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696df444c7-czvwj" event={"ID":"987275a8-7f64-4c33-a5a6-d07ada638b6f","Type":"ContainerDied","Data":"cf0a30925055a288bf87cdb7d261a222415e4eb577bb5924ad10e966bed9f1af"} Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.172787 4693 scope.go:117] "RemoveContainer" containerID="a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193510 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193541 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193555 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193564 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkhww\" (UniqueName: \"kubernetes.io/projected/987275a8-7f64-4c33-a5a6-d07ada638b6f-kube-api-access-bkhww\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193574 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193582 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.193591 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/987275a8-7f64-4c33-a5a6-d07ada638b6f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.248234 4693 scope.go:117] "RemoveContainer" containerID="926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.259524 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.274851 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-696df444c7-czvwj"] Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.279166 4693 scope.go:117] "RemoveContainer" containerID="a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c" Dec 04 10:10:14 crc kubenswrapper[4693]: E1204 10:10:14.279738 4693 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c\": container with ID starting with a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c not found: ID does not exist" containerID="a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.279793 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c"} err="failed to get container status \"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c\": rpc error: code = NotFound desc = could not find container \"a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c\": container with ID starting with a411478f7222ec21c7bbe9a61dd34e0ca9cd59da91f67c531d3ba41759d70c0c not found: ID does not exist" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.279830 4693 scope.go:117] "RemoveContainer" containerID="926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235" Dec 04 10:10:14 crc kubenswrapper[4693]: E1204 10:10:14.280161 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235\": container with ID starting with 926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235 not found: ID does not exist" containerID="926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.280189 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235"} err="failed to get container status \"926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235\": rpc error: code = NotFound desc = could not find container \"926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235\": container with ID starting with 926316b857a1552eba745781a281e53f0ccdd7ced705365b9830c967aa5d7235 not found: ID does not exist" Dec 04 10:10:14 crc kubenswrapper[4693]: I1204 10:10:14.509748 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" path="/var/lib/kubelet/pods/987275a8-7f64-4c33-a5a6-d07ada638b6f/volumes" Dec 04 10:10:22 crc kubenswrapper[4693]: I1204 10:10:22.272466 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:10:22 crc kubenswrapper[4693]: I1204 10:10:22.272899 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:10:25 crc kubenswrapper[4693]: I1204 10:10:25.288700 4693 generic.go:334] "Generic (PLEG): container finished" podID="c1ee328e-d29f-4224-913b-bc23195bf2b2" containerID="adb04669f3faec3a20decfc6aef31e7b3df15f420269b34e4f26d26e42790b1c" exitCode=0 Dec 04 10:10:25 crc kubenswrapper[4693]: I1204 10:10:25.288930 4693 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1ee328e-d29f-4224-913b-bc23195bf2b2","Type":"ContainerDied","Data":"adb04669f3faec3a20decfc6aef31e7b3df15f420269b34e4f26d26e42790b1c"} Dec 04 10:10:25 crc kubenswrapper[4693]: I1204 10:10:25.295726 4693 generic.go:334] "Generic (PLEG): container finished" podID="c0a72230-d599-4df6-bd4b-279092bf8861" containerID="77a389618637a23a79e2c4c6722400fac12de64d8735f08fe4eb506e62e5cbe0" exitCode=0 Dec 04 10:10:25 crc kubenswrapper[4693]: I1204 10:10:25.295820 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0a72230-d599-4df6-bd4b-279092bf8861","Type":"ContainerDied","Data":"77a389618637a23a79e2c4c6722400fac12de64d8735f08fe4eb506e62e5cbe0"} Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.306956 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1ee328e-d29f-4224-913b-bc23195bf2b2","Type":"ContainerStarted","Data":"3dbca03ce3e7f90aa27e91c0464a0a141c08fcb96b7ed49c4801ce8b2666109f"} Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.307559 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.310652 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c0a72230-d599-4df6-bd4b-279092bf8861","Type":"ContainerStarted","Data":"05a56e54f88ea80efa7ec474f0238c22a7cf1724d10f96fcc47201afa4f82535"} Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.310875 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.337566 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.337545879 podStartE2EDuration="36.337545879s" podCreationTimestamp="2025-12-04 10:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:10:26.336641065 +0000 UTC m=+1672.234234838" watchObservedRunningTime="2025-12-04 10:10:26.337545879 +0000 UTC m=+1672.235139632" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.361624 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.361604558 podStartE2EDuration="36.361604558s" podCreationTimestamp="2025-12-04 10:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:10:26.359479602 +0000 UTC m=+1672.257073355" watchObservedRunningTime="2025-12-04 10:10:26.361604558 +0000 UTC m=+1672.259198311" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.870135 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5"] Dec 04 10:10:26 crc kubenswrapper[4693]: E1204 10:10:26.870943 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="init" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.870966 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="init" Dec 04 10:10:26 crc kubenswrapper[4693]: E1204 10:10:26.870987 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.870996 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: E1204 10:10:26.871026 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.871034 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: E1204 10:10:26.871048 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="init" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.871055 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="init" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.871254 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="987275a8-7f64-4c33-a5a6-d07ada638b6f" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.871290 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdebc460-1495-4d6e-9621-3117453dd084" containerName="dnsmasq-dns" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.871961 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.874488 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.875281 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.875403 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.880777 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.903605 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5"] Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.959015 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.959122 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.959148 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5c47\" (UniqueName: \"kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:26 crc kubenswrapper[4693]: I1204 10:10:26.959262 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.061604 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.061728 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.061773 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.061791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5c47\" (UniqueName: \"kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.068139 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.069150 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.075820 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.080216 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5c47\" (UniqueName: \"kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.189470 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.837382 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5"] Dec 04 10:10:27 crc kubenswrapper[4693]: I1204 10:10:27.854589 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:10:28 crc kubenswrapper[4693]: I1204 10:10:28.352593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" event={"ID":"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288","Type":"ContainerStarted","Data":"eb822529e1930ffbd42e59e855edf58db139419a8a8b59ef2859a2e0d4301341"} Dec 04 10:10:37 crc kubenswrapper[4693]: I1204 10:10:37.867870 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:10:38 crc kubenswrapper[4693]: I1204 10:10:38.492153 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" event={"ID":"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288","Type":"ContainerStarted","Data":"beab8f7a143f2c9846357c891fefa4485ca50794dd327d4ef557fcf862b8f70a"} Dec 04 10:10:38 crc kubenswrapper[4693]: I1204 10:10:38.508141 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" podStartSLOduration=2.496968132 podStartE2EDuration="12.508119072s" podCreationTimestamp="2025-12-04 10:10:26 +0000 UTC" firstStartedPulling="2025-12-04 10:10:27.854224366 +0000 UTC m=+1673.751818119" lastFinishedPulling="2025-12-04 10:10:37.865375306 +0000 UTC m=+1683.762969059" observedRunningTime="2025-12-04 10:10:38.500576865 +0000 UTC m=+1684.398170648" watchObservedRunningTime="2025-12-04 10:10:38.508119072 +0000 UTC m=+1684.405712825" Dec 04 10:10:40 crc kubenswrapper[4693]: I1204 10:10:40.582503 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 10:10:40 crc kubenswrapper[4693]: I1204 10:10:40.711493 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:10:51 crc kubenswrapper[4693]: I1204 10:10:51.645074 4693 generic.go:334] "Generic (PLEG): container finished" podID="21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" containerID="beab8f7a143f2c9846357c891fefa4485ca50794dd327d4ef557fcf862b8f70a" exitCode=0 Dec 04 10:10:51 crc kubenswrapper[4693]: I1204 10:10:51.645167 4693 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" event={"ID":"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288","Type":"ContainerDied","Data":"beab8f7a143f2c9846357c891fefa4485ca50794dd327d4ef557fcf862b8f70a"} Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.272320 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.272675 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.272724 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.273525 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.273583 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" gracePeriod=600 Dec 04 10:10:52 crc kubenswrapper[4693]: E1204 10:10:52.400986 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.657046 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" exitCode=0 Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.657262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316"} Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.657301 4693 scope.go:117] "RemoveContainer" containerID="b7bd03640b7e4a33a647c5d1603e98e993284c3b724300f1b3ae4227fa75eb8c" Dec 04 10:10:52 crc kubenswrapper[4693]: I1204 10:10:52.658019 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:10:52 crc kubenswrapper[4693]: E1204 10:10:52.658310 4693 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.083137 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.241606 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key\") pod \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.241679 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory\") pod \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.241869 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle\") pod \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.241926 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5c47\" (UniqueName: \"kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47\") pod \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\" (UID: \"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288\") " Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.247173 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" (UID: "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.247665 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47" (OuterVolumeSpecName: "kube-api-access-j5c47") pod "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" (UID: "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288"). InnerVolumeSpecName "kube-api-access-j5c47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.273570 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory" (OuterVolumeSpecName: "inventory") pod "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" (UID: "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.274083 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" (UID: "21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.344918 4693 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.344961 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5c47\" (UniqueName: \"kubernetes.io/projected/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-kube-api-access-j5c47\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.344974 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.344989 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.672654 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" event={"ID":"21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288","Type":"ContainerDied","Data":"eb822529e1930ffbd42e59e855edf58db139419a8a8b59ef2859a2e0d4301341"} Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.672999 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb822529e1930ffbd42e59e855edf58db139419a8a8b59ef2859a2e0d4301341" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.672770 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.792858 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9"] Dec 04 10:10:53 crc kubenswrapper[4693]: E1204 10:10:53.793312 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.793357 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.793620 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.794462 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.796615 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.797443 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.797518 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.797582 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.801866 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9"] Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.956413 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.956465 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:53 crc kubenswrapper[4693]: I1204 10:10:53.956583 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7flg8\" (UniqueName: \"kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.059679 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flg8\" (UniqueName: \"kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.059789 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.059815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.065362 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.066252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.081071 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flg8\" (UniqueName: \"kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ksdq9\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.114717 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.642968 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9"] Dec 04 10:10:54 crc kubenswrapper[4693]: W1204 10:10:54.645000 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80be6b5e_e208_4c31_a663_4a01f460ea18.slice/crio-2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b WatchSource:0}: Error finding container 2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b: Status 404 returned error can't find the container with id 2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b Dec 04 10:10:54 crc kubenswrapper[4693]: I1204 10:10:54.715132 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" event={"ID":"80be6b5e-e208-4c31-a663-4a01f460ea18","Type":"ContainerStarted","Data":"2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b"} Dec 04 10:10:55 crc kubenswrapper[4693]: I1204 10:10:55.725654 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" event={"ID":"80be6b5e-e208-4c31-a663-4a01f460ea18","Type":"ContainerStarted","Data":"35063e66f2ead0a4be7a7c8b6c2be8d6ccec5f0ba5d3edcb05a1b366eeac0649"} Dec 04 10:10:55 crc kubenswrapper[4693]: I1204 10:10:55.743564 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" podStartSLOduration=2.3192490550000002 podStartE2EDuration="2.743542599s" podCreationTimestamp="2025-12-04 10:10:53 +0000 UTC" firstStartedPulling="2025-12-04 10:10:54.647809918 +0000 UTC m=+1700.545403671" lastFinishedPulling="2025-12-04 10:10:55.072103462 +0000 UTC m=+1700.969697215" observedRunningTime="2025-12-04 10:10:55.738665761 +0000 UTC m=+1701.636259514" watchObservedRunningTime="2025-12-04 10:10:55.743542599 +0000 UTC 
m=+1701.641136352" Dec 04 10:10:58 crc kubenswrapper[4693]: I1204 10:10:58.753548 4693 generic.go:334] "Generic (PLEG): container finished" podID="80be6b5e-e208-4c31-a663-4a01f460ea18" containerID="35063e66f2ead0a4be7a7c8b6c2be8d6ccec5f0ba5d3edcb05a1b366eeac0649" exitCode=0 Dec 04 10:10:58 crc kubenswrapper[4693]: I1204 10:10:58.753643 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" event={"ID":"80be6b5e-e208-4c31-a663-4a01f460ea18","Type":"ContainerDied","Data":"35063e66f2ead0a4be7a7c8b6c2be8d6ccec5f0ba5d3edcb05a1b366eeac0649"} Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.137553 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.293425 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key\") pod \"80be6b5e-e208-4c31-a663-4a01f460ea18\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.293502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7flg8\" (UniqueName: \"kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8\") pod \"80be6b5e-e208-4c31-a663-4a01f460ea18\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.293635 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory\") pod \"80be6b5e-e208-4c31-a663-4a01f460ea18\" (UID: \"80be6b5e-e208-4c31-a663-4a01f460ea18\") " Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.298984 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8" (OuterVolumeSpecName: "kube-api-access-7flg8") pod "80be6b5e-e208-4c31-a663-4a01f460ea18" (UID: "80be6b5e-e208-4c31-a663-4a01f460ea18"). InnerVolumeSpecName "kube-api-access-7flg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.325905 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80be6b5e-e208-4c31-a663-4a01f460ea18" (UID: "80be6b5e-e208-4c31-a663-4a01f460ea18"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.328568 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory" (OuterVolumeSpecName: "inventory") pod "80be6b5e-e208-4c31-a663-4a01f460ea18" (UID: "80be6b5e-e208-4c31-a663-4a01f460ea18"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.395547 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7flg8\" (UniqueName: \"kubernetes.io/projected/80be6b5e-e208-4c31-a663-4a01f460ea18-kube-api-access-7flg8\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.395580 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.395589 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80be6b5e-e208-4c31-a663-4a01f460ea18-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.774932 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" event={"ID":"80be6b5e-e208-4c31-a663-4a01f460ea18","Type":"ContainerDied","Data":"2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b"} Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.774976 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d0d2ec0a6ff0766d85ea2194c816ab5310e1b6b455be87c8ed62be42e65a72b" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.774991 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ksdq9" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.877378 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv"] Dec 04 10:11:00 crc kubenswrapper[4693]: E1204 10:11:00.877990 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80be6b5e-e208-4c31-a663-4a01f460ea18" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.878009 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="80be6b5e-e208-4c31-a663-4a01f460ea18" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.878241 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="80be6b5e-e208-4c31-a663-4a01f460ea18" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.879191 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.882174 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.882581 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.884640 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.884873 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:11:00 crc kubenswrapper[4693]: I1204 10:11:00.891932 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv"] Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.009082 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txkh5\" (UniqueName: \"kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.009167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.009261 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.009321 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.112883 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.113068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.113251 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txkh5\" (UniqueName: \"kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.113389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.122876 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.131363 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.147068 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.147099 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txkh5\" (UniqueName: \"kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.215861 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.585384 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv"] Dec 04 10:11:01 crc kubenswrapper[4693]: W1204 10:11:01.587751 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd60e9f3_ac52_4a2b_9e3b_80720e7634ab.slice/crio-f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45 WatchSource:0}: Error finding container f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45: Status 404 returned error can't find the container with id f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45 Dec 04 10:11:01 crc kubenswrapper[4693]: I1204 10:11:01.793720 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" event={"ID":"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab","Type":"ContainerStarted","Data":"f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45"} Dec 04 10:11:02 crc kubenswrapper[4693]: I1204 10:11:02.805311 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" event={"ID":"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab","Type":"ContainerStarted","Data":"42e449664459b8264e78dc035d27f909016923b575457835a074c683ab9c0b34"} Dec 04 10:11:02 crc kubenswrapper[4693]: I1204 10:11:02.826797 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" podStartSLOduration=2.350361428 podStartE2EDuration="2.826775714s" podCreationTimestamp="2025-12-04 10:11:00 +0000 UTC" firstStartedPulling="2025-12-04 10:11:01.589744209 +0000 UTC m=+1707.487337972" lastFinishedPulling="2025-12-04 10:11:02.066158495 +0000 UTC m=+1707.963752258" observedRunningTime="2025-12-04 10:11:02.821729262 +0000 UTC m=+1708.719323015" watchObservedRunningTime="2025-12-04 10:11:02.826775714 +0000 UTC m=+1708.724369467" Dec 04 10:11:03 crc kubenswrapper[4693]: I1204 10:11:03.461180 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:11:03 crc kubenswrapper[4693]: E1204 10:11:03.461831 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:11:07 crc kubenswrapper[4693]: I1204 10:11:07.698195 4693 scope.go:117] "RemoveContainer" containerID="fa9176a09b30d68f093d6dca03844f9bce4164b1818bfc5bd839a618c29b688c" Dec 04 10:11:14 crc kubenswrapper[4693]: I1204 10:11:14.470852 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:11:14 crc kubenswrapper[4693]: E1204 10:11:14.471695 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:11:25 crc kubenswrapper[4693]: I1204 10:11:25.462693 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:11:25 crc kubenswrapper[4693]: E1204 10:11:25.463597 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:11:37 crc kubenswrapper[4693]: I1204 10:11:37.461205 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:11:37 crc kubenswrapper[4693]: E1204 10:11:37.461954 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:11:50 crc kubenswrapper[4693]: I1204 10:11:50.461206 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:11:50 crc kubenswrapper[4693]: E1204 10:11:50.462000 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:01 crc kubenswrapper[4693]: I1204 10:12:01.461549 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:12:01 crc kubenswrapper[4693]: E1204 10:12:01.463257 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:07 crc kubenswrapper[4693]: I1204 10:12:07.784875 4693 scope.go:117] "RemoveContainer" containerID="4c5e19c3b9d3c1e1b8b40d70d16b050682e30ec966883be7ebb5bf627f58c55b" Dec 04 10:12:07 crc kubenswrapper[4693]: I1204 10:12:07.811621 4693 scope.go:117] "RemoveContainer" containerID="4392f0b7049337839504d250e6ebc64937035f7589a6309cf39ca4378181567c" Dec 04 10:12:13 crc kubenswrapper[4693]: I1204 10:12:13.461255 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:12:13 crc kubenswrapper[4693]: E1204 10:12:13.462590 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:25 crc kubenswrapper[4693]: I1204 10:12:25.462695 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:12:25 crc kubenswrapper[4693]: E1204 10:12:25.463467 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:40 crc kubenswrapper[4693]: I1204 10:12:40.461771 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:12:40 crc kubenswrapper[4693]: E1204 10:12:40.462594 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:54 crc kubenswrapper[4693]: I1204 10:12:54.468719 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:12:54 crc kubenswrapper[4693]: E1204 10:12:54.469699 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.119503 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.151442 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.151614 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.244539 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.244616 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndp9\" (UniqueName: \"kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.244750 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.346424 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.346578 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.346615 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndp9\" (UniqueName: \"kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.347097 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.347102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities\") pod \"redhat-operators-9jlg9\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.375153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndp9\" (UniqueName: \"kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9\") pod \"redhat-operators-9jlg9\" (UID: 
\"3058aa9f-c978-40f1-842f-e4730e695c81\") " pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.487968 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:12:56 crc kubenswrapper[4693]: I1204 10:12:56.948615 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:12:57 crc kubenswrapper[4693]: I1204 10:12:57.002931 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerStarted","Data":"e6220f420c9d47fbea8cbded1ae69ac1226ceeb2efac454b77be1d20f0049ae3"} Dec 04 10:12:58 crc kubenswrapper[4693]: I1204 10:12:58.017147 4693 generic.go:334] "Generic (PLEG): container finished" podID="3058aa9f-c978-40f1-842f-e4730e695c81" containerID="94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771" exitCode=0 Dec 04 10:12:58 crc kubenswrapper[4693]: I1204 10:12:58.017324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerDied","Data":"94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771"} Dec 04 10:13:02 crc kubenswrapper[4693]: I1204 10:13:02.055249 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerStarted","Data":"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6"} Dec 04 10:13:03 crc kubenswrapper[4693]: I1204 10:13:03.070041 4693 generic.go:334] "Generic (PLEG): container finished" podID="3058aa9f-c978-40f1-842f-e4730e695c81" containerID="ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6" exitCode=0 Dec 04 10:13:03 crc kubenswrapper[4693]: I1204 10:13:03.070088 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerDied","Data":"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6"} Dec 04 10:13:06 crc kubenswrapper[4693]: I1204 10:13:06.462117 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:13:06 crc kubenswrapper[4693]: E1204 10:13:06.463391 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:13:07 crc kubenswrapper[4693]: I1204 10:13:07.865563 4693 scope.go:117] "RemoveContainer" containerID="19d3deca871c730ec7f45a785c188a9d62294f067f38dbf4ca998260fda6d6d1" Dec 04 10:13:07 crc kubenswrapper[4693]: I1204 10:13:07.885748 4693 scope.go:117] "RemoveContainer" containerID="c1a65b68919f6b99e0d561363427d858c224a65bbd13ca1d92f6fdc533a4dbee" Dec 04 10:13:08 crc kubenswrapper[4693]: I1204 10:13:08.114157 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" 
event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerStarted","Data":"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b"} Dec 04 10:13:08 crc kubenswrapper[4693]: I1204 10:13:08.140384 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jlg9" podStartSLOduration=2.598842787 podStartE2EDuration="12.140366845s" podCreationTimestamp="2025-12-04 10:12:56 +0000 UTC" firstStartedPulling="2025-12-04 10:12:58.020899545 +0000 UTC m=+1823.918493308" lastFinishedPulling="2025-12-04 10:13:07.562423613 +0000 UTC m=+1833.460017366" observedRunningTime="2025-12-04 10:13:08.134165764 +0000 UTC m=+1834.031759518" watchObservedRunningTime="2025-12-04 10:13:08.140366845 +0000 UTC m=+1834.037960598" Dec 04 10:13:16 crc kubenswrapper[4693]: I1204 10:13:16.488982 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:16 crc kubenswrapper[4693]: I1204 10:13:16.489606 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:16 crc kubenswrapper[4693]: I1204 10:13:16.537170 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:17 crc kubenswrapper[4693]: I1204 10:13:17.261531 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:17 crc kubenswrapper[4693]: I1204 10:13:17.315619 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.224380 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jlg9" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="registry-server" containerID="cri-o://2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b" gracePeriod=2 Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.689107 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.830935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndp9\" (UniqueName: \"kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9\") pod \"3058aa9f-c978-40f1-842f-e4730e695c81\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.831032 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content\") pod \"3058aa9f-c978-40f1-842f-e4730e695c81\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.831052 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities\") pod \"3058aa9f-c978-40f1-842f-e4730e695c81\" (UID: \"3058aa9f-c978-40f1-842f-e4730e695c81\") " Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.832499 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities" (OuterVolumeSpecName: "utilities") pod "3058aa9f-c978-40f1-842f-e4730e695c81" (UID: "3058aa9f-c978-40f1-842f-e4730e695c81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.838112 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9" (OuterVolumeSpecName: "kube-api-access-6ndp9") pod "3058aa9f-c978-40f1-842f-e4730e695c81" (UID: "3058aa9f-c978-40f1-842f-e4730e695c81"). InnerVolumeSpecName "kube-api-access-6ndp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.934644 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndp9\" (UniqueName: \"kubernetes.io/projected/3058aa9f-c978-40f1-842f-e4730e695c81-kube-api-access-6ndp9\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.934686 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:19 crc kubenswrapper[4693]: I1204 10:13:19.938798 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3058aa9f-c978-40f1-842f-e4730e695c81" (UID: "3058aa9f-c978-40f1-842f-e4730e695c81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.036597 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3058aa9f-c978-40f1-842f-e4730e695c81-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.234579 4693 generic.go:334] "Generic (PLEG): container finished" podID="3058aa9f-c978-40f1-842f-e4730e695c81" containerID="2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b" exitCode=0 Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.234653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerDied","Data":"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b"} Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.234693 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jlg9" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.234706 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jlg9" event={"ID":"3058aa9f-c978-40f1-842f-e4730e695c81","Type":"ContainerDied","Data":"e6220f420c9d47fbea8cbded1ae69ac1226ceeb2efac454b77be1d20f0049ae3"} Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.234745 4693 scope.go:117] "RemoveContainer" containerID="2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.272231 4693 scope.go:117] "RemoveContainer" containerID="ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.274521 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.286895 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jlg9"] Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.313668 4693 scope.go:117] "RemoveContainer" containerID="94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.388544 4693 scope.go:117] "RemoveContainer" containerID="2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b" Dec 04 10:13:20 crc kubenswrapper[4693]: E1204 10:13:20.389317 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b\": container with ID starting with 2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b not found: ID does not exist" containerID="2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.389388 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b"} err="failed to get container status \"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b\": rpc error: code = NotFound desc = could not find container \"2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b\": container with ID starting with 2e9ea34fe6a88daaca3fbb6437ceac2f0f1ad45b4bef4d3d1116d3e150321b0b not found: ID does not exist" Dec 04 10:13:20 crc 
kubenswrapper[4693]: I1204 10:13:20.389423 4693 scope.go:117] "RemoveContainer" containerID="ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6" Dec 04 10:13:20 crc kubenswrapper[4693]: E1204 10:13:20.389874 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6\": container with ID starting with ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6 not found: ID does not exist" containerID="ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.389915 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6"} err="failed to get container status \"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6\": rpc error: code = NotFound desc = could not find container \"ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6\": container with ID starting with ace1ab8e081faa7e0e2b62990888e76d6aa420cc2995b3addb109423a300f5d6 not found: ID does not exist" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.389942 4693 scope.go:117] "RemoveContainer" containerID="94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771" Dec 04 10:13:20 crc kubenswrapper[4693]: E1204 10:13:20.390199 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771\": container with ID starting with 94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771 not found: ID does not exist" containerID="94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.390225 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771"} err="failed to get container status \"94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771\": rpc error: code = NotFound desc = could not find container \"94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771\": container with ID starting with 94a5e7aa194517a2789e799f59dc1d2cdeb160c0695ca7dd7bb5af83e51d0771 not found: ID does not exist" Dec 04 10:13:20 crc kubenswrapper[4693]: I1204 10:13:20.474607 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" path="/var/lib/kubelet/pods/3058aa9f-c978-40f1-842f-e4730e695c81/volumes" Dec 04 10:13:21 crc kubenswrapper[4693]: I1204 10:13:21.461761 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:13:21 crc kubenswrapper[4693]: E1204 10:13:21.462297 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:13:32 crc kubenswrapper[4693]: I1204 10:13:32.461643 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" 
Dec 04 10:13:32 crc kubenswrapper[4693]: E1204 10:13:32.462639 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.053797 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6716-account-create-update-v4sm8"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.064771 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ba22-account-create-update-vzw95"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.074638 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gb6kd"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.085085 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6716-account-create-update-v4sm8"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.095052 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ba22-account-create-update-vzw95"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.104291 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-84c8-account-create-update-wmv28"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.113249 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gb6kd"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.122076 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-84c8-account-create-update-wmv28"] Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.471465 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f8a950-a259-4279-99e9-d33a4fc93e7d" path="/var/lib/kubelet/pods/14f8a950-a259-4279-99e9-d33a4fc93e7d/volumes" Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.472676 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75" path="/var/lib/kubelet/pods/9ebe1ff2-6f41-4f5b-a346-5664fc8c1d75/volumes" Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.473933 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e98765-1b21-4b1a-80be-e5dc19d13082" path="/var/lib/kubelet/pods/b1e98765-1b21-4b1a-80be-e5dc19d13082/volumes" Dec 04 10:13:46 crc kubenswrapper[4693]: I1204 10:13:46.475451 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f9fe71-06d0-4075-a262-74050f6b73d7" path="/var/lib/kubelet/pods/c7f9fe71-06d0-4075-a262-74050f6b73d7/volumes" Dec 04 10:13:47 crc kubenswrapper[4693]: I1204 10:13:47.033460 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-h22jp"] Dec 04 10:13:47 crc kubenswrapper[4693]: I1204 10:13:47.042706 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ck5pq"] Dec 04 10:13:47 crc kubenswrapper[4693]: I1204 10:13:47.051867 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ck5pq"] Dec 04 10:13:47 crc kubenswrapper[4693]: I1204 10:13:47.060721 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-h22jp"] Dec 04 10:13:47 crc kubenswrapper[4693]: 
I1204 10:13:47.461872 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:13:47 crc kubenswrapper[4693]: E1204 10:13:47.462154 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:13:48 crc kubenswrapper[4693]: I1204 10:13:48.471138 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561053fe-0024-4f25-bfab-94a8e139ac06" path="/var/lib/kubelet/pods/561053fe-0024-4f25-bfab-94a8e139ac06/volumes" Dec 04 10:13:48 crc kubenswrapper[4693]: I1204 10:13:48.472548 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7aea09-5790-4fda-9a37-8ada0326c2d0" path="/var/lib/kubelet/pods/df7aea09-5790-4fda-9a37-8ada0326c2d0/volumes" Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.045740 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.058879 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4211-account-create-update-r4vb9"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.076573 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-j5pgf"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.087490 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ef17-account-create-update-wcgbx"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.096294 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-j5pgf"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.104575 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ef17-account-create-update-wcgbx"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.136563 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.149193 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4211-account-create-update-r4vb9"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.160920 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sk4bl"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.172286 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sk4bl"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.185352 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0576-account-create-update-gqnh2"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.197778 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-rc9st"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.206739 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0576-account-create-update-gqnh2"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.217812 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-rc9st"] Dec 04 10:14:01 crc kubenswrapper[4693]: I1204 10:14:01.461654 
4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:14:01 crc kubenswrapper[4693]: E1204 10:14:01.462038 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.475135 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c48aae9-77d8-4b25-989c-7b51c5938929" path="/var/lib/kubelet/pods/1c48aae9-77d8-4b25-989c-7b51c5938929/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.476368 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba2460c-1d0a-4b7b-8159-5c94778aab54" path="/var/lib/kubelet/pods/4ba2460c-1d0a-4b7b-8159-5c94778aab54/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.477323 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55360827-2b39-4864-8d98-deb8d8fca9cc" path="/var/lib/kubelet/pods/55360827-2b39-4864-8d98-deb8d8fca9cc/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.478045 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6436522d-be9a-4cca-9928-8a001d22836e" path="/var/lib/kubelet/pods/6436522d-be9a-4cca-9928-8a001d22836e/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.479246 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cafe793-8113-4ef4-ada0-5e699a68ea59" path="/var/lib/kubelet/pods/7cafe793-8113-4ef4-ada0-5e699a68ea59/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.480577 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed103ee-266a-4848-a410-2e01ea8f694a" path="/var/lib/kubelet/pods/7ed103ee-266a-4848-a410-2e01ea8f694a/volumes" Dec 04 10:14:02 crc kubenswrapper[4693]: I1204 10:14:02.481194 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a392a0d0-239b-4be6-896c-3da1a401c361" path="/var/lib/kubelet/pods/a392a0d0-239b-4be6-896c-3da1a401c361/volumes" Dec 04 10:14:07 crc kubenswrapper[4693]: I1204 10:14:07.935439 4693 scope.go:117] "RemoveContainer" containerID="6766b86962fb54c643e411c42447b10e6ebf764c420d665a426178b94907f5f9" Dec 04 10:14:07 crc kubenswrapper[4693]: I1204 10:14:07.977983 4693 scope.go:117] "RemoveContainer" containerID="662439998a3072442ca04691f337bcd55a5184dab1b0b487322fbf785af52ec7" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.013740 4693 scope.go:117] "RemoveContainer" containerID="728e17ef8b0681df2395de98aa589cd1ee8e766b880f4843de777c6c7352820f" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.060322 4693 scope.go:117] "RemoveContainer" containerID="31a4a898c67dbbd2395f3445f8b9591483d46d43dcd95e79b00268b365e55d7b" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.104614 4693 scope.go:117] "RemoveContainer" containerID="305c7316da231b936393e56f4c94571114d9dc064bbf3f8ce2d05010597a5964" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.157223 4693 scope.go:117] "RemoveContainer" containerID="c80f2f65e9a2b955f175661de526c8dc4736042f77434b1c2ccd6eea7df00287" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.216062 4693 scope.go:117] "RemoveContainer" 
containerID="0f8c449a5b8b7ad671abb92b5de3898b5c1062812568636c7b6e76bcce45212c" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.239447 4693 scope.go:117] "RemoveContainer" containerID="1da0bda23c5533ab29df254eb9925799aa521a038d9779a2295fedb2af048e9c" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.259274 4693 scope.go:117] "RemoveContainer" containerID="7ee28d74d1c1c14f248f78b43ea1615867ee2d955debb12ff85f1b44167e2b37" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.279632 4693 scope.go:117] "RemoveContainer" containerID="3e3f5a6d15a1b4477625cf1b05c6de6234542606e032cec47ed7a06b9f1e8a44" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.310623 4693 scope.go:117] "RemoveContainer" containerID="8b75043cd3fb50f0f4d747dd8ca8914186d73fadd142fd55ddb872745a3513e8" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.339166 4693 scope.go:117] "RemoveContainer" containerID="69327e0323421578e04dda6f967a1f364d5e266f184aad070cddcc2470dc1de2" Dec 04 10:14:08 crc kubenswrapper[4693]: I1204 10:14:08.364349 4693 scope.go:117] "RemoveContainer" containerID="83e70c585c53621a59ee096045bbd9feccbef3d920036abb57870ca78e08a542" Dec 04 10:14:10 crc kubenswrapper[4693]: I1204 10:14:10.027320 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gxwt6"] Dec 04 10:14:10 crc kubenswrapper[4693]: I1204 10:14:10.041884 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gxwt6"] Dec 04 10:14:10 crc kubenswrapper[4693]: I1204 10:14:10.472980 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93060d5-ff5d-4f46-991d-b9b40c5d280d" path="/var/lib/kubelet/pods/d93060d5-ff5d-4f46-991d-b9b40c5d280d/volumes" Dec 04 10:14:16 crc kubenswrapper[4693]: I1204 10:14:16.464533 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:14:16 crc kubenswrapper[4693]: E1204 10:14:16.465451 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:14:24 crc kubenswrapper[4693]: I1204 10:14:24.064891 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4zr9z"] Dec 04 10:14:24 crc kubenswrapper[4693]: I1204 10:14:24.076982 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4zr9z"] Dec 04 10:14:24 crc kubenswrapper[4693]: I1204 10:14:24.471092 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e565feb-3867-434a-9b7e-9cae0a2f9152" path="/var/lib/kubelet/pods/4e565feb-3867-434a-9b7e-9cae0a2f9152/volumes" Dec 04 10:14:26 crc kubenswrapper[4693]: I1204 10:14:26.433774 4693 generic.go:334] "Generic (PLEG): container finished" podID="bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" containerID="42e449664459b8264e78dc035d27f909016923b575457835a074c683ab9c0b34" exitCode=0 Dec 04 10:14:26 crc kubenswrapper[4693]: I1204 10:14:26.434078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" event={"ID":"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab","Type":"ContainerDied","Data":"42e449664459b8264e78dc035d27f909016923b575457835a074c683ab9c0b34"} Dec 04 
10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.462175 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:14:27 crc kubenswrapper[4693]: E1204 10:14:27.462736 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.871564 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.962325 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle\") pod \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.962532 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory\") pod \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.962628 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txkh5\" (UniqueName: \"kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5\") pod \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.962717 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key\") pod \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\" (UID: \"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab\") " Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.969789 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5" (OuterVolumeSpecName: "kube-api-access-txkh5") pod "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" (UID: "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab"). InnerVolumeSpecName "kube-api-access-txkh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.979468 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" (UID: "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
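[annotation] The recurring "Error syncing pod ... CrashLoopBackOff: back-off 5m0s restarting failed container=machine-config-daemon" entries show the kubelet refusing to restart the crashed container until its per-container backoff expires; later in this log (10:16:00) the container does come back with a new ContainerStarted event. As a rough illustration only, not read from this node's configuration, the sketch below prints a backoff schedule under the commonly cited kubelet defaults of a 10-second initial delay that doubles per crash and is capped at 5 minutes, which is why the retry attempts above settle into roughly five-minute intervals once the cap is hit.

// backoff_sketch.go - illustrative only: a kubelet-style crash-loop backoff
// schedule assuming upstream defaults (10s initial delay, doubling per
// restart, capped at 5m). Verify against the kubelet version actually running;
// nothing here is read from the node.
package main

import (
	"fmt"
	"time"
)

func main() {
	const initial = 10 * time.Second
	const maxDelay = 5 * time.Minute // the "back-off 5m0s" seen in the log

	delay := initial
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("after crash %d: wait %s before the next start attempt\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}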
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.996387 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory" (OuterVolumeSpecName: "inventory") pod "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" (UID: "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:14:27 crc kubenswrapper[4693]: I1204 10:14:27.997445 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" (UID: "bd60e9f3-ac52-4a2b-9e3b-80720e7634ab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.066816 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txkh5\" (UniqueName: \"kubernetes.io/projected/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-kube-api-access-txkh5\") on node \"crc\" DevicePath \"\"" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.066858 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.066872 4693 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.066889 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd60e9f3-ac52-4a2b-9e3b-80720e7634ab-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.452258 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" event={"ID":"bd60e9f3-ac52-4a2b-9e3b-80720e7634ab","Type":"ContainerDied","Data":"f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45"} Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.452295 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b812916ebb634a43b201a1251d9935e4bc116451132abe7b7ff94e535f4f45" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.452303 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.561222 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz"] Dec 04 10:14:28 crc kubenswrapper[4693]: E1204 10:14:28.561769 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="extract-content" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.561786 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="extract-content" Dec 04 10:14:28 crc kubenswrapper[4693]: E1204 10:14:28.561821 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="registry-server" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.561830 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="registry-server" Dec 04 10:14:28 crc kubenswrapper[4693]: E1204 10:14:28.561853 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.561863 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:14:28 crc kubenswrapper[4693]: E1204 10:14:28.561888 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="extract-utilities" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.561898 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="extract-utilities" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.562171 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd60e9f3-ac52-4a2b-9e3b-80720e7634ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.562211 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3058aa9f-c978-40f1-842f-e4730e695c81" containerName="registry-server" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.563056 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.566192 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.566302 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.567239 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.567250 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.576534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.576582 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.576644 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tncq8\" (UniqueName: \"kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.587029 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz"] Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.686926 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.687005 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.687065 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tncq8\" (UniqueName: \"kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.692188 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.702888 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.706741 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tncq8\" (UniqueName: \"kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:28 crc kubenswrapper[4693]: I1204 10:14:28.898664 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:14:29 crc kubenswrapper[4693]: I1204 10:14:29.281102 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz"] Dec 04 10:14:29 crc kubenswrapper[4693]: I1204 10:14:29.461488 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" event={"ID":"710e62b8-160d-49f9-8bdb-418a0ee9f379","Type":"ContainerStarted","Data":"201590ae42dcd70dbabbb244e59286af95f2fcd0be2473d4dd0abdf87b14f854"} Dec 04 10:14:30 crc kubenswrapper[4693]: I1204 10:14:30.474648 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" event={"ID":"710e62b8-160d-49f9-8bdb-418a0ee9f379","Type":"ContainerStarted","Data":"4199d5c8518db722f3bbdc8e96ab9c5125bbee426f9e65547624fb5ee9d5fcce"} Dec 04 10:14:30 crc kubenswrapper[4693]: I1204 10:14:30.500084 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" podStartSLOduration=2.055818465 podStartE2EDuration="2.500060481s" podCreationTimestamp="2025-12-04 10:14:28 +0000 UTC" firstStartedPulling="2025-12-04 10:14:29.284208846 +0000 UTC m=+1915.181802609" lastFinishedPulling="2025-12-04 10:14:29.728450872 +0000 UTC m=+1915.626044625" observedRunningTime="2025-12-04 10:14:30.485924684 +0000 UTC m=+1916.383518437" watchObservedRunningTime="2025-12-04 10:14:30.500060481 +0000 UTC m=+1916.397654234" Dec 04 10:14:39 crc kubenswrapper[4693]: I1204 10:14:39.461411 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:14:39 crc kubenswrapper[4693]: E1204 10:14:39.462095 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:14:50 crc kubenswrapper[4693]: I1204 10:14:50.061437 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9q6d6"] Dec 04 10:14:50 crc kubenswrapper[4693]: I1204 10:14:50.076827 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9q6d6"] Dec 04 10:14:50 crc kubenswrapper[4693]: I1204 10:14:50.474173 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6343580b-81bd-4993-a298-3b31730e6ae3" path="/var/lib/kubelet/pods/6343580b-81bd-4993-a298-3b31730e6ae3/volumes" Dec 04 10:14:53 crc kubenswrapper[4693]: I1204 10:14:53.461013 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:14:53 crc kubenswrapper[4693]: E1204 10:14:53.461515 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.143673 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk"] Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.145502 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.147455 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.149125 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.153851 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk"] Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.309116 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.309414 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktwv\" (UniqueName: \"kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.309466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.411162 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktwv\" (UniqueName: \"kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.411203 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.411248 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.412188 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume\") pod 
\"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.416566 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.429382 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktwv\" (UniqueName: \"kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv\") pod \"collect-profiles-29414055-gfcpk\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.470040 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:00 crc kubenswrapper[4693]: I1204 10:15:00.922231 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk"] Dec 04 10:15:01 crc kubenswrapper[4693]: I1204 10:15:01.735045 4693 generic.go:334] "Generic (PLEG): container finished" podID="ea614f99-9d53-4a2e-b796-e2e603bac316" containerID="74364817e68e39e08e34c2f883ea44961f45d87f86e3c63a855eacbf953ec317" exitCode=0 Dec 04 10:15:01 crc kubenswrapper[4693]: I1204 10:15:01.735096 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" event={"ID":"ea614f99-9d53-4a2e-b796-e2e603bac316","Type":"ContainerDied","Data":"74364817e68e39e08e34c2f883ea44961f45d87f86e3c63a855eacbf953ec317"} Dec 04 10:15:01 crc kubenswrapper[4693]: I1204 10:15:01.735360 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" event={"ID":"ea614f99-9d53-4a2e-b796-e2e603bac316","Type":"ContainerStarted","Data":"7ebc0961cd92cca31af09dd7c6e08b3a817098ef94ae47447663a967d78bc746"} Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.099808 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.263804 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume\") pod \"ea614f99-9d53-4a2e-b796-e2e603bac316\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.263858 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume\") pod \"ea614f99-9d53-4a2e-b796-e2e603bac316\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.264025 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ktwv\" (UniqueName: \"kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv\") pod \"ea614f99-9d53-4a2e-b796-e2e603bac316\" (UID: \"ea614f99-9d53-4a2e-b796-e2e603bac316\") " Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.265005 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea614f99-9d53-4a2e-b796-e2e603bac316" (UID: "ea614f99-9d53-4a2e-b796-e2e603bac316"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.269050 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv" (OuterVolumeSpecName: "kube-api-access-7ktwv") pod "ea614f99-9d53-4a2e-b796-e2e603bac316" (UID: "ea614f99-9d53-4a2e-b796-e2e603bac316"). InnerVolumeSpecName "kube-api-access-7ktwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.269253 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea614f99-9d53-4a2e-b796-e2e603bac316" (UID: "ea614f99-9d53-4a2e-b796-e2e603bac316"). InnerVolumeSpecName "secret-volume". 
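[annotation] Every kubelet message in this journal carries a klog header of the form "I1204 10:14:01.045740 4693 kubelet.go:2437] ...": a severity letter, month and day, wall-clock time, the kubelet PID, the emitting source file and line, then the structured message. For pulling those fields out of a captured journal, a throwaway parser along these lines is usually enough; the regular expression below is an illustration tuned to the lines above, not an official klog parser, and it expects the journald prefix (the "Dec 04 ... kubenswrapper[4693]:" part) to have been removed already, e.g. by reading journalctl output with -o cat.

// klogfields_sketch.go - illustrative parser for the klog header visible in
// the kubelet messages above: <sev><MMDD> <HH:MM:SS.micros> <pid> <file:line>] <msg>
package main

import (
	"fmt"
	"regexp"
)

var klogHeader = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./-]+:\d+)\] (.*)$`)

func main() {
	line := `I1204 10:14:01.045740 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ae08-account-create-update-xbp2n"]`

	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog-formatted line")
		return
	}
	fmt.Println("severity:", m[1]) // I, W, E or F
	fmt.Println("date:    ", m[2]) // MMDD
	fmt.Println("time:    ", m[3])
	fmt.Println("pid:     ", m[4])
	fmt.Println("source:  ", m[5]) // file.go:line
	fmt.Println("message: ", m[6])
}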
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.365840 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea614f99-9d53-4a2e-b796-e2e603bac316-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.365881 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea614f99-9d53-4a2e-b796-e2e603bac316-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.365916 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ktwv\" (UniqueName: \"kubernetes.io/projected/ea614f99-9d53-4a2e-b796-e2e603bac316-kube-api-access-7ktwv\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.764404 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" event={"ID":"ea614f99-9d53-4a2e-b796-e2e603bac316","Type":"ContainerDied","Data":"7ebc0961cd92cca31af09dd7c6e08b3a817098ef94ae47447663a967d78bc746"} Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.764441 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk" Dec 04 10:15:03 crc kubenswrapper[4693]: I1204 10:15:03.764444 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebc0961cd92cca31af09dd7c6e08b3a817098ef94ae47447663a967d78bc746" Dec 04 10:15:04 crc kubenswrapper[4693]: I1204 10:15:04.162581 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw"] Dec 04 10:15:04 crc kubenswrapper[4693]: I1204 10:15:04.172369 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-4vpsw"] Dec 04 10:15:04 crc kubenswrapper[4693]: I1204 10:15:04.514850 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36782e8d-b271-46a5-8f96-8979022991f2" path="/var/lib/kubelet/pods/36782e8d-b271-46a5-8f96-8979022991f2/volumes" Dec 04 10:15:05 crc kubenswrapper[4693]: I1204 10:15:05.461432 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:15:05 crc kubenswrapper[4693]: E1204 10:15:05.461932 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.041310 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fc99g"] Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.052775 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fc99g"] Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.471521 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228babee-6748-4512-bd76-92168eab2e2d" path="/var/lib/kubelet/pods/228babee-6748-4512-bd76-92168eab2e2d/volumes" Dec 04 10:15:08 crc 
kubenswrapper[4693]: I1204 10:15:08.611772 4693 scope.go:117] "RemoveContainer" containerID="b85d1a787743ad35f3711e020be32203b24a8d6da34cb07a14f0877594cbd4e4" Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.633826 4693 scope.go:117] "RemoveContainer" containerID="dab9c1011ea07d88de65b3480f049b963e02c9aff2d843592df61f618cb2f585" Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.680085 4693 scope.go:117] "RemoveContainer" containerID="fbf61924d8fb146210c44e926db797ef98bfd70f46df952c59704501d3c5dbb2" Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.744491 4693 scope.go:117] "RemoveContainer" containerID="510ab1cfe0a6633c3ddde08be3570e8036917a89eebd96b3ae933a4e05c57267" Dec 04 10:15:08 crc kubenswrapper[4693]: I1204 10:15:08.772667 4693 scope.go:117] "RemoveContainer" containerID="bff8e3ebeba716474615ecef4839fdd6eea301daa1ae5f089ff8b5e4b1c0f79b" Dec 04 10:15:18 crc kubenswrapper[4693]: I1204 10:15:18.462075 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:15:18 crc kubenswrapper[4693]: E1204 10:15:18.463573 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:15:25 crc kubenswrapper[4693]: I1204 10:15:25.075426 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-srzft"] Dec 04 10:15:25 crc kubenswrapper[4693]: I1204 10:15:25.092446 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-srzft"] Dec 04 10:15:26 crc kubenswrapper[4693]: I1204 10:15:26.472154 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7a03e1-cb13-4536-9405-791381101cdc" path="/var/lib/kubelet/pods/af7a03e1-cb13-4536-9405-791381101cdc/volumes" Dec 04 10:15:27 crc kubenswrapper[4693]: I1204 10:15:27.032360 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jw5j4"] Dec 04 10:15:27 crc kubenswrapper[4693]: I1204 10:15:27.041199 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jw5j4"] Dec 04 10:15:28 crc kubenswrapper[4693]: I1204 10:15:28.471020 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f3c238-cf53-4a99-96da-ae6118b711b4" path="/var/lib/kubelet/pods/00f3c238-cf53-4a99-96da-ae6118b711b4/volumes" Dec 04 10:15:32 crc kubenswrapper[4693]: I1204 10:15:32.461596 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:15:32 crc kubenswrapper[4693]: E1204 10:15:32.463742 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:15:47 crc kubenswrapper[4693]: I1204 10:15:47.461993 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:15:47 
crc kubenswrapper[4693]: E1204 10:15:47.463261 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:15:59 crc kubenswrapper[4693]: I1204 10:15:59.461870 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:16:00 crc kubenswrapper[4693]: I1204 10:16:00.267489 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0"} Dec 04 10:16:06 crc kubenswrapper[4693]: I1204 10:16:06.038853 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-p69cr"] Dec 04 10:16:06 crc kubenswrapper[4693]: I1204 10:16:06.048493 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-p69cr"] Dec 04 10:16:06 crc kubenswrapper[4693]: I1204 10:16:06.474197 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62cb864-103c-4b89-afeb-8397af4046cb" path="/var/lib/kubelet/pods/a62cb864-103c-4b89-afeb-8397af4046cb/volumes" Dec 04 10:16:08 crc kubenswrapper[4693]: I1204 10:16:08.946280 4693 scope.go:117] "RemoveContainer" containerID="714f5dd6229837b0c880729a145472584d9396fcefc23b643b9687974c4c9ffe" Dec 04 10:16:09 crc kubenswrapper[4693]: I1204 10:16:09.457834 4693 scope.go:117] "RemoveContainer" containerID="cf6afee287e4f350cd3e182859a40e414bd245cc2f53f5a178eb6dca2920a8c6" Dec 04 10:16:09 crc kubenswrapper[4693]: I1204 10:16:09.489276 4693 scope.go:117] "RemoveContainer" containerID="3b7577f95e018f83d92ce7dc59d26f9be7ad89e895504b1685de208e6d55b1c3" Dec 04 10:16:25 crc kubenswrapper[4693]: I1204 10:16:25.037539 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-q642m"] Dec 04 10:16:25 crc kubenswrapper[4693]: I1204 10:16:25.045090 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-q642m"] Dec 04 10:16:26 crc kubenswrapper[4693]: I1204 10:16:26.472886 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832e603c-b695-442e-bcf6-fa322cfc1524" path="/var/lib/kubelet/pods/832e603c-b695-442e-bcf6-fa322cfc1524/volumes" Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.033613 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0a0a-account-create-update-5cbb7"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.043147 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tjkb9"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.052161 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4252s"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.063718 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0a0a-account-create-update-5cbb7"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.074279 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hllxc"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 
10:16:35.082314 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hllxc"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.089699 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4252s"] Dec 04 10:16:35 crc kubenswrapper[4693]: I1204 10:16:35.096775 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tjkb9"] Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.050115 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-752a-account-create-update-xdxqj"] Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.060057 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-752a-account-create-update-xdxqj"] Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.070321 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f3cd-account-create-update-5sv79"] Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.076260 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f3cd-account-create-update-5sv79"] Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.484440 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00409360-0d5e-4451-b83d-84fbbf011c66" path="/var/lib/kubelet/pods/00409360-0d5e-4451-b83d-84fbbf011c66/volumes" Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.485238 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0805403f-8f31-4183-a1b6-d1eedcb64a8d" path="/var/lib/kubelet/pods/0805403f-8f31-4183-a1b6-d1eedcb64a8d/volumes" Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.485865 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e19a65e-3f81-4663-8041-8f2186d3d6c2" path="/var/lib/kubelet/pods/1e19a65e-3f81-4663-8041-8f2186d3d6c2/volumes" Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.486458 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430ee024-3291-4a26-865f-4d1300bf5ea9" path="/var/lib/kubelet/pods/430ee024-3291-4a26-865f-4d1300bf5ea9/volumes" Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.487788 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e822040-7ff2-49be-bd68-70dc15db9ff9" path="/var/lib/kubelet/pods/8e822040-7ff2-49be-bd68-70dc15db9ff9/volumes" Dec 04 10:16:36 crc kubenswrapper[4693]: I1204 10:16:36.488349 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0dd0393-4f73-4cbd-be2a-4b4471ea154c" path="/var/lib/kubelet/pods/a0dd0393-4f73-4cbd-be2a-4b4471ea154c/volumes" Dec 04 10:16:41 crc kubenswrapper[4693]: I1204 10:16:41.420520 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lvx99"] Dec 04 10:16:41 crc kubenswrapper[4693]: I1204 10:16:41.432618 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lvx99"] Dec 04 10:16:42 crc kubenswrapper[4693]: I1204 10:16:42.476706 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d" path="/var/lib/kubelet/pods/7c3ad32c-eb55-44e3-bd1e-caa1409f0b1d/volumes" Dec 04 10:16:52 crc kubenswrapper[4693]: I1204 10:16:52.531396 4693 generic.go:334] "Generic (PLEG): container finished" podID="710e62b8-160d-49f9-8bdb-418a0ee9f379" containerID="4199d5c8518db722f3bbdc8e96ab9c5125bbee426f9e65547624fb5ee9d5fcce" exitCode=0 Dec 04 10:16:52 crc kubenswrapper[4693]: I1204 10:16:52.531517 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" event={"ID":"710e62b8-160d-49f9-8bdb-418a0ee9f379","Type":"ContainerDied","Data":"4199d5c8518db722f3bbdc8e96ab9c5125bbee426f9e65547624fb5ee9d5fcce"} Dec 04 10:16:53 crc kubenswrapper[4693]: I1204 10:16:53.969921 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.152520 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key\") pod \"710e62b8-160d-49f9-8bdb-418a0ee9f379\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.152624 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory\") pod \"710e62b8-160d-49f9-8bdb-418a0ee9f379\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.152786 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tncq8\" (UniqueName: \"kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8\") pod \"710e62b8-160d-49f9-8bdb-418a0ee9f379\" (UID: \"710e62b8-160d-49f9-8bdb-418a0ee9f379\") " Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.157671 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8" (OuterVolumeSpecName: "kube-api-access-tncq8") pod "710e62b8-160d-49f9-8bdb-418a0ee9f379" (UID: "710e62b8-160d-49f9-8bdb-418a0ee9f379"). InnerVolumeSpecName "kube-api-access-tncq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.178753 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "710e62b8-160d-49f9-8bdb-418a0ee9f379" (UID: "710e62b8-160d-49f9-8bdb-418a0ee9f379"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.179310 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory" (OuterVolumeSpecName: "inventory") pod "710e62b8-160d-49f9-8bdb-418a0ee9f379" (UID: "710e62b8-160d-49f9-8bdb-418a0ee9f379"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.254865 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tncq8\" (UniqueName: \"kubernetes.io/projected/710e62b8-160d-49f9-8bdb-418a0ee9f379-kube-api-access-tncq8\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.254902 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.254916 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/710e62b8-160d-49f9-8bdb-418a0ee9f379-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.550234 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" event={"ID":"710e62b8-160d-49f9-8bdb-418a0ee9f379","Type":"ContainerDied","Data":"201590ae42dcd70dbabbb244e59286af95f2fcd0be2473d4dd0abdf87b14f854"} Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.550269 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="201590ae42dcd70dbabbb244e59286af95f2fcd0be2473d4dd0abdf87b14f854" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.550291 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.629538 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5"] Dec 04 10:16:54 crc kubenswrapper[4693]: E1204 10:16:54.629924 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710e62b8-160d-49f9-8bdb-418a0ee9f379" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.629942 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="710e62b8-160d-49f9-8bdb-418a0ee9f379" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:16:54 crc kubenswrapper[4693]: E1204 10:16:54.629968 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea614f99-9d53-4a2e-b796-e2e603bac316" containerName="collect-profiles" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.629974 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea614f99-9d53-4a2e-b796-e2e603bac316" containerName="collect-profiles" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.630166 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="710e62b8-160d-49f9-8bdb-418a0ee9f379" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.630180 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea614f99-9d53-4a2e-b796-e2e603bac316" containerName="collect-profiles" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.630827 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.633950 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.634443 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.634732 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.635513 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.641510 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5"] Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.787161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm87m\" (UniqueName: \"kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.787266 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.787484 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.889279 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.889823 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm87m\" (UniqueName: \"kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.889925 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.895777 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.906592 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.906891 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm87m\" (UniqueName: \"kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-994x5\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:54 crc kubenswrapper[4693]: I1204 10:16:54.957440 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:16:55 crc kubenswrapper[4693]: I1204 10:16:55.480434 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:16:55 crc kubenswrapper[4693]: I1204 10:16:55.480638 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5"] Dec 04 10:16:55 crc kubenswrapper[4693]: I1204 10:16:55.561558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" event={"ID":"5aeee95a-198f-47ed-859b-0f710da9768c","Type":"ContainerStarted","Data":"1b813f87dfa2dce54e148fa3afbffc4b08184646590a554612826ac3bd45742d"} Dec 04 10:16:56 crc kubenswrapper[4693]: I1204 10:16:56.570906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" event={"ID":"5aeee95a-198f-47ed-859b-0f710da9768c","Type":"ContainerStarted","Data":"db5094aec04520fd931a91c2dabdd0392e82ef7b36f604782a5a13ee6c3296ed"} Dec 04 10:16:56 crc kubenswrapper[4693]: I1204 10:16:56.586228 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" podStartSLOduration=2.007338762 podStartE2EDuration="2.586211038s" podCreationTimestamp="2025-12-04 10:16:54 +0000 UTC" firstStartedPulling="2025-12-04 10:16:55.480180773 +0000 UTC m=+2061.377774526" lastFinishedPulling="2025-12-04 10:16:56.059053059 +0000 UTC m=+2061.956646802" observedRunningTime="2025-12-04 10:16:56.583919045 +0000 UTC m=+2062.481512798" watchObservedRunningTime="2025-12-04 10:16:56.586211038 +0000 UTC m=+2062.483804791" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.613867 4693 scope.go:117] "RemoveContainer" 
containerID="e1856c4bcc5f7d4dd767ab5d0ee7c26159e7fce3876db52d046b3e1b06a476cf" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.653804 4693 scope.go:117] "RemoveContainer" containerID="1130c7cc91ecaad62e289e83e7d312bb1bb1acedcefe385696f1aa2de6f6b2e2" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.695156 4693 scope.go:117] "RemoveContainer" containerID="1403c2a6800e91b083b88e01018f1c82e58505ca5db9710670f9173d0b49fc74" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.729485 4693 scope.go:117] "RemoveContainer" containerID="095835de9d6a0d01ca21c9eb02b0bd85fe078f73384a7b8dbc9a177551b4bcb4" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.772456 4693 scope.go:117] "RemoveContainer" containerID="309b3a1bca1f57660263110cf5e57e4c7231c19b667ca928f625270e3c3e1eaa" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.823881 4693 scope.go:117] "RemoveContainer" containerID="7a63ce2b45cffe3f9b5a685e4ac1159c6ded18084f9dae5f791bdd6fb3752530" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.865413 4693 scope.go:117] "RemoveContainer" containerID="50639091b6440ac439ad39a9f535dc9133f75266600ec01466240f152f9a11af" Dec 04 10:17:09 crc kubenswrapper[4693]: I1204 10:17:09.883288 4693 scope.go:117] "RemoveContainer" containerID="ceeddbea04eada8185ac7490c16a472d90714892ff0cec6ab7c2bb7bc1fcc931" Dec 04 10:17:48 crc kubenswrapper[4693]: I1204 10:17:48.053463 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ttkgx"] Dec 04 10:17:48 crc kubenswrapper[4693]: I1204 10:17:48.062103 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ttkgx"] Dec 04 10:17:48 crc kubenswrapper[4693]: I1204 10:17:48.479680 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67" path="/var/lib/kubelet/pods/b0a25a4b-56b5-461e-b9f0-44fe3b1f5e67/volumes" Dec 04 10:18:10 crc kubenswrapper[4693]: I1204 10:18:10.039617 4693 scope.go:117] "RemoveContainer" containerID="ad02b5a091f3913441761bde718ecf4dac9d0711387f15d0dc6554de601b3e14" Dec 04 10:18:14 crc kubenswrapper[4693]: I1204 10:18:14.033614 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2n7g"] Dec 04 10:18:14 crc kubenswrapper[4693]: I1204 10:18:14.044137 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2n7g"] Dec 04 10:18:14 crc kubenswrapper[4693]: I1204 10:18:14.484685 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e81cf43-7573-4691-878b-eeae474d75be" path="/var/lib/kubelet/pods/5e81cf43-7573-4691-878b-eeae474d75be/volumes" Dec 04 10:18:15 crc kubenswrapper[4693]: I1204 10:18:15.027433 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xfxvd"] Dec 04 10:18:15 crc kubenswrapper[4693]: I1204 10:18:15.035489 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xfxvd"] Dec 04 10:18:16 crc kubenswrapper[4693]: I1204 10:18:16.478936 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713c9761-3dbd-4889-9678-9acee2bd6635" path="/var/lib/kubelet/pods/713c9761-3dbd-4889-9678-9acee2bd6635/volumes" Dec 04 10:18:22 crc kubenswrapper[4693]: I1204 10:18:22.273158 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:18:22 crc kubenswrapper[4693]: I1204 10:18:22.273811 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:18:28 crc kubenswrapper[4693]: I1204 10:18:28.376278 4693 generic.go:334] "Generic (PLEG): container finished" podID="5aeee95a-198f-47ed-859b-0f710da9768c" containerID="db5094aec04520fd931a91c2dabdd0392e82ef7b36f604782a5a13ee6c3296ed" exitCode=0 Dec 04 10:18:28 crc kubenswrapper[4693]: I1204 10:18:28.376524 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" event={"ID":"5aeee95a-198f-47ed-859b-0f710da9768c","Type":"ContainerDied","Data":"db5094aec04520fd931a91c2dabdd0392e82ef7b36f604782a5a13ee6c3296ed"} Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.863627 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.919934 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm87m\" (UniqueName: \"kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m\") pod \"5aeee95a-198f-47ed-859b-0f710da9768c\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.920272 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory\") pod \"5aeee95a-198f-47ed-859b-0f710da9768c\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.920425 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key\") pod \"5aeee95a-198f-47ed-859b-0f710da9768c\" (UID: \"5aeee95a-198f-47ed-859b-0f710da9768c\") " Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.925928 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m" (OuterVolumeSpecName: "kube-api-access-xm87m") pod "5aeee95a-198f-47ed-859b-0f710da9768c" (UID: "5aeee95a-198f-47ed-859b-0f710da9768c"). InnerVolumeSpecName "kube-api-access-xm87m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.947701 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5aeee95a-198f-47ed-859b-0f710da9768c" (UID: "5aeee95a-198f-47ed-859b-0f710da9768c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:29 crc kubenswrapper[4693]: I1204 10:18:29.960865 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory" (OuterVolumeSpecName: "inventory") pod "5aeee95a-198f-47ed-859b-0f710da9768c" (UID: "5aeee95a-198f-47ed-859b-0f710da9768c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.022512 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.022540 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5aeee95a-198f-47ed-859b-0f710da9768c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.022550 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm87m\" (UniqueName: \"kubernetes.io/projected/5aeee95a-198f-47ed-859b-0f710da9768c-kube-api-access-xm87m\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.399019 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" event={"ID":"5aeee95a-198f-47ed-859b-0f710da9768c","Type":"ContainerDied","Data":"1b813f87dfa2dce54e148fa3afbffc4b08184646590a554612826ac3bd45742d"} Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.399056 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b813f87dfa2dce54e148fa3afbffc4b08184646590a554612826ac3bd45742d" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.399099 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-994x5" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.479917 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn"] Dec 04 10:18:30 crc kubenswrapper[4693]: E1204 10:18:30.480449 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aeee95a-198f-47ed-859b-0f710da9768c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.480471 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aeee95a-198f-47ed-859b-0f710da9768c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.480746 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aeee95a-198f-47ed-859b-0f710da9768c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.481593 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.484691 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.485178 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.485829 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.486052 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.494108 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn"] Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.533834 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5bd\" (UniqueName: \"kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.534899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.534975 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.637085 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5bd\" (UniqueName: \"kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.637188 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.637213 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.641264 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.649895 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.652696 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5bd\" (UniqueName: \"kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:30 crc kubenswrapper[4693]: I1204 10:18:30.810890 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:31 crc kubenswrapper[4693]: I1204 10:18:31.325092 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn"] Dec 04 10:18:31 crc kubenswrapper[4693]: I1204 10:18:31.407589 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" event={"ID":"bbf583ab-d797-4781-a13e-d4d493483d3e","Type":"ContainerStarted","Data":"94e3a61d1317db49616fb90bf8dbf931f578fdd35bc15abf1becec911ed6f536"} Dec 04 10:18:32 crc kubenswrapper[4693]: I1204 10:18:32.417720 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" event={"ID":"bbf583ab-d797-4781-a13e-d4d493483d3e","Type":"ContainerStarted","Data":"1a492421c9fd1276f50629c824fb586d4762f856771e6cad15bd36a18775862a"} Dec 04 10:18:36 crc kubenswrapper[4693]: I1204 10:18:36.455014 4693 generic.go:334] "Generic (PLEG): container finished" podID="bbf583ab-d797-4781-a13e-d4d493483d3e" containerID="1a492421c9fd1276f50629c824fb586d4762f856771e6cad15bd36a18775862a" exitCode=0 Dec 04 10:18:36 crc kubenswrapper[4693]: I1204 10:18:36.455143 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" event={"ID":"bbf583ab-d797-4781-a13e-d4d493483d3e","Type":"ContainerDied","Data":"1a492421c9fd1276f50629c824fb586d4762f856771e6cad15bd36a18775862a"} Dec 04 10:18:37 crc kubenswrapper[4693]: I1204 10:18:37.865300 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:37 crc kubenswrapper[4693]: I1204 10:18:37.996030 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key\") pod \"bbf583ab-d797-4781-a13e-d4d493483d3e\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " Dec 04 10:18:37 crc kubenswrapper[4693]: I1204 10:18:37.996703 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5bd\" (UniqueName: \"kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd\") pod \"bbf583ab-d797-4781-a13e-d4d493483d3e\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " Dec 04 10:18:37 crc kubenswrapper[4693]: I1204 10:18:37.996828 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory\") pod \"bbf583ab-d797-4781-a13e-d4d493483d3e\" (UID: \"bbf583ab-d797-4781-a13e-d4d493483d3e\") " Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.003461 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd" (OuterVolumeSpecName: "kube-api-access-zs5bd") pod "bbf583ab-d797-4781-a13e-d4d493483d3e" (UID: "bbf583ab-d797-4781-a13e-d4d493483d3e"). InnerVolumeSpecName "kube-api-access-zs5bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.031043 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory" (OuterVolumeSpecName: "inventory") pod "bbf583ab-d797-4781-a13e-d4d493483d3e" (UID: "bbf583ab-d797-4781-a13e-d4d493483d3e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.037582 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bbf583ab-d797-4781-a13e-d4d493483d3e" (UID: "bbf583ab-d797-4781-a13e-d4d493483d3e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.099957 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.100182 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf583ab-d797-4781-a13e-d4d493483d3e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.100257 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5bd\" (UniqueName: \"kubernetes.io/projected/bbf583ab-d797-4781-a13e-d4d493483d3e-kube-api-access-zs5bd\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.475946 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" event={"ID":"bbf583ab-d797-4781-a13e-d4d493483d3e","Type":"ContainerDied","Data":"94e3a61d1317db49616fb90bf8dbf931f578fdd35bc15abf1becec911ed6f536"} Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.475993 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e3a61d1317db49616fb90bf8dbf931f578fdd35bc15abf1becec911ed6f536" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.476058 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.544783 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n"] Dec 04 10:18:38 crc kubenswrapper[4693]: E1204 10:18:38.545233 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf583ab-d797-4781-a13e-d4d493483d3e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.545250 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf583ab-d797-4781-a13e-d4d493483d3e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.545482 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf583ab-d797-4781-a13e-d4d493483d3e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.546132 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.551030 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.551080 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.551115 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.551695 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.553026 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n"] Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.612540 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfdf\" (UniqueName: \"kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.612606 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.612785 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.715234 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.715358 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfdf\" (UniqueName: \"kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.715383 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: 
\"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.719081 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.719098 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.731125 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfdf\" (UniqueName: \"kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cld2n\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:38 crc kubenswrapper[4693]: I1204 10:18:38.875383 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:18:39 crc kubenswrapper[4693]: I1204 10:18:39.397073 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n"] Dec 04 10:18:39 crc kubenswrapper[4693]: I1204 10:18:39.485194 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" event={"ID":"c8c97263-d34a-4231-9f52-5f0aae7163f2","Type":"ContainerStarted","Data":"4cd61f596f345b7f60679b9fc81e80d73dd06c9998b6fc2c0180094a3ea11b06"} Dec 04 10:18:40 crc kubenswrapper[4693]: I1204 10:18:40.498539 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" event={"ID":"c8c97263-d34a-4231-9f52-5f0aae7163f2","Type":"ContainerStarted","Data":"751e416a95f1a638ef5f8ad8cc95d6916b0d9f9bfe50ce81f3c3500c644477fd"} Dec 04 10:18:52 crc kubenswrapper[4693]: I1204 10:18:52.272661 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:18:52 crc kubenswrapper[4693]: I1204 10:18:52.273166 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:18:59 crc kubenswrapper[4693]: I1204 10:18:59.051054 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" podStartSLOduration=20.627548084 podStartE2EDuration="21.051033095s" podCreationTimestamp="2025-12-04 10:18:38 +0000 UTC" 
firstStartedPulling="2025-12-04 10:18:39.401285375 +0000 UTC m=+2165.298879128" lastFinishedPulling="2025-12-04 10:18:39.824770396 +0000 UTC m=+2165.722364139" observedRunningTime="2025-12-04 10:18:40.524147651 +0000 UTC m=+2166.421741494" watchObservedRunningTime="2025-12-04 10:18:59.051033095 +0000 UTC m=+2184.948626848" Dec 04 10:18:59 crc kubenswrapper[4693]: I1204 10:18:59.054501 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pp9r5"] Dec 04 10:18:59 crc kubenswrapper[4693]: I1204 10:18:59.064166 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pp9r5"] Dec 04 10:19:00 crc kubenswrapper[4693]: I1204 10:19:00.472886 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f223e615-c099-4f70-b613-3f438d533326" path="/var/lib/kubelet/pods/f223e615-c099-4f70-b613-3f438d533326/volumes" Dec 04 10:19:10 crc kubenswrapper[4693]: I1204 10:19:10.107373 4693 scope.go:117] "RemoveContainer" containerID="a8bbc45cb758b8cc0c6d98a2d90f19d61fd7af8daf5de08f295908f76256a884" Dec 04 10:19:10 crc kubenswrapper[4693]: I1204 10:19:10.151117 4693 scope.go:117] "RemoveContainer" containerID="8fcbc0ab777db633d734638b6dfc6f50f372f62d7296cb8fd8788f20b86b027f" Dec 04 10:19:10 crc kubenswrapper[4693]: I1204 10:19:10.203273 4693 scope.go:117] "RemoveContainer" containerID="c530c346a3e1f67d5d8af849cfd15ff925733b6c421d92b1d21044479d983558" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.204675 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.206850 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.223549 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.268197 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.268404 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.268482 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6vf\" (UniqueName: \"kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.369828 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content\") pod \"community-operators-kj5tj\" (UID: 
\"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.369923 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6vf\" (UniqueName: \"kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.369993 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.370379 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.370602 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.391197 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6vf\" (UniqueName: \"kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf\") pod \"community-operators-kj5tj\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:14 crc kubenswrapper[4693]: I1204 10:19:14.528504 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:15 crc kubenswrapper[4693]: I1204 10:19:15.037533 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:15 crc kubenswrapper[4693]: I1204 10:19:15.806736 4693 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerID="ee93919cfd0793e592bfcde7c02afeb079dc64d6ddcf3795038e039cb5916809" exitCode=0 Dec 04 10:19:15 crc kubenswrapper[4693]: I1204 10:19:15.807987 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerDied","Data":"ee93919cfd0793e592bfcde7c02afeb079dc64d6ddcf3795038e039cb5916809"} Dec 04 10:19:15 crc kubenswrapper[4693]: I1204 10:19:15.808072 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerStarted","Data":"7c5bfbf98a7f4b3c26e2302bb8bea2f2371d0c923fcad8ddf75897ba63897160"} Dec 04 10:19:16 crc kubenswrapper[4693]: I1204 10:19:16.984042 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:16 crc kubenswrapper[4693]: I1204 10:19:16.986818 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:16 crc kubenswrapper[4693]: I1204 10:19:16.994045 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.034515 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.034600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.034903 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4xl5\" (UniqueName: \"kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.136842 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.136959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4xl5\" (UniqueName: 
\"kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.137049 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.137561 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.137623 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.159695 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4xl5\" (UniqueName: \"kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5\") pod \"certified-operators-8rgql\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.318407 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.825566 4693 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerID="c2dbeb7075aa12666bf882f5f06805afe6376a522cf3066672503fe4575d05f2" exitCode=0 Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.825638 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerDied","Data":"c2dbeb7075aa12666bf882f5f06805afe6376a522cf3066672503fe4575d05f2"} Dec 04 10:19:17 crc kubenswrapper[4693]: I1204 10:19:17.837343 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:17 crc kubenswrapper[4693]: W1204 10:19:17.846916 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88af83d2_3de8_429b_9fd0_6dd81ed68def.slice/crio-52496bd9ff255b80b6ec791cabeddcca727b3c899bab7b883a7c1663015d2c90 WatchSource:0}: Error finding container 52496bd9ff255b80b6ec791cabeddcca727b3c899bab7b883a7c1663015d2c90: Status 404 returned error can't find the container with id 52496bd9ff255b80b6ec791cabeddcca727b3c899bab7b883a7c1663015d2c90 Dec 04 10:19:18 crc kubenswrapper[4693]: I1204 10:19:18.866613 4693 generic.go:334] "Generic (PLEG): container finished" podID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerID="499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72" exitCode=0 Dec 04 10:19:18 crc kubenswrapper[4693]: I1204 10:19:18.866763 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerDied","Data":"499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72"} Dec 04 10:19:18 crc kubenswrapper[4693]: I1204 10:19:18.867217 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerStarted","Data":"52496bd9ff255b80b6ec791cabeddcca727b3c899bab7b883a7c1663015d2c90"} Dec 04 10:19:20 crc kubenswrapper[4693]: I1204 10:19:20.883407 4693 generic.go:334] "Generic (PLEG): container finished" podID="c8c97263-d34a-4231-9f52-5f0aae7163f2" containerID="751e416a95f1a638ef5f8ad8cc95d6916b0d9f9bfe50ce81f3c3500c644477fd" exitCode=0 Dec 04 10:19:20 crc kubenswrapper[4693]: I1204 10:19:20.883497 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" event={"ID":"c8c97263-d34a-4231-9f52-5f0aae7163f2","Type":"ContainerDied","Data":"751e416a95f1a638ef5f8ad8cc95d6916b0d9f9bfe50ce81f3c3500c644477fd"} Dec 04 10:19:20 crc kubenswrapper[4693]: I1204 10:19:20.887572 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerStarted","Data":"1224eb80475c04852e4264fc71022aa18c789ebd70828ba40bb70b9d290987ea"} Dec 04 10:19:20 crc kubenswrapper[4693]: I1204 10:19:20.952120 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kj5tj" podStartSLOduration=4.056415006 podStartE2EDuration="6.952101214s" podCreationTimestamp="2025-12-04 10:19:14 +0000 UTC" firstStartedPulling="2025-12-04 10:19:15.810135858 +0000 
UTC m=+2201.707729611" lastFinishedPulling="2025-12-04 10:19:18.705822066 +0000 UTC m=+2204.603415819" observedRunningTime="2025-12-04 10:19:20.94459247 +0000 UTC m=+2206.842186243" watchObservedRunningTime="2025-12-04 10:19:20.952101214 +0000 UTC m=+2206.849694967" Dec 04 10:19:21 crc kubenswrapper[4693]: I1204 10:19:21.902432 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerStarted","Data":"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85"} Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.272982 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.273449 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.273508 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.274284 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.274362 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0" gracePeriod=600 Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.310236 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.343709 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqfdf\" (UniqueName: \"kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf\") pod \"c8c97263-d34a-4231-9f52-5f0aae7163f2\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.343790 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key\") pod \"c8c97263-d34a-4231-9f52-5f0aae7163f2\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.344388 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory\") pod \"c8c97263-d34a-4231-9f52-5f0aae7163f2\" (UID: \"c8c97263-d34a-4231-9f52-5f0aae7163f2\") " Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.356878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf" (OuterVolumeSpecName: "kube-api-access-gqfdf") pod "c8c97263-d34a-4231-9f52-5f0aae7163f2" (UID: "c8c97263-d34a-4231-9f52-5f0aae7163f2"). InnerVolumeSpecName "kube-api-access-gqfdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.376873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c8c97263-d34a-4231-9f52-5f0aae7163f2" (UID: "c8c97263-d34a-4231-9f52-5f0aae7163f2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.381943 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory" (OuterVolumeSpecName: "inventory") pod "c8c97263-d34a-4231-9f52-5f0aae7163f2" (UID: "c8c97263-d34a-4231-9f52-5f0aae7163f2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.447707 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.447753 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqfdf\" (UniqueName: \"kubernetes.io/projected/c8c97263-d34a-4231-9f52-5f0aae7163f2-kube-api-access-gqfdf\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.447769 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8c97263-d34a-4231-9f52-5f0aae7163f2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.913241 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" event={"ID":"c8c97263-d34a-4231-9f52-5f0aae7163f2","Type":"ContainerDied","Data":"4cd61f596f345b7f60679b9fc81e80d73dd06c9998b6fc2c0180094a3ea11b06"} Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.913308 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd61f596f345b7f60679b9fc81e80d73dd06c9998b6fc2c0180094a3ea11b06" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.913269 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cld2n" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.915529 4693 generic.go:334] "Generic (PLEG): container finished" podID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerID="98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85" exitCode=0 Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.915568 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerDied","Data":"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85"} Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.999437 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q"] Dec 04 10:19:22 crc kubenswrapper[4693]: E1204 10:19:22.999822 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c97263-d34a-4231-9f52-5f0aae7163f2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:19:22 crc kubenswrapper[4693]: I1204 10:19:22.999841 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c97263-d34a-4231-9f52-5f0aae7163f2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.000072 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c97263-d34a-4231-9f52-5f0aae7163f2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.000738 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.002876 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.003136 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.003145 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.004828 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.011758 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q"] Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.067469 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lxj\" (UniqueName: \"kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.067536 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.067806 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.169959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lxj\" (UniqueName: \"kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.170009 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.170064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" 
(UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.175835 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.183776 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.187109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9lxj\" (UniqueName: \"kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2p64q\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.348712 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.867597 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q"] Dec 04 10:19:23 crc kubenswrapper[4693]: W1204 10:19:23.873953 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ee0ede_3c46_4a69_b2e5_00901d0ee8a6.slice/crio-e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35 WatchSource:0}: Error finding container e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35: Status 404 returned error can't find the container with id e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35 Dec 04 10:19:23 crc kubenswrapper[4693]: I1204 10:19:23.928034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" event={"ID":"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6","Type":"ContainerStarted","Data":"e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35"} Dec 04 10:19:24 crc kubenswrapper[4693]: I1204 10:19:24.528967 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:24 crc kubenswrapper[4693]: I1204 10:19:24.529015 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:24 crc kubenswrapper[4693]: I1204 10:19:24.576089 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:24 crc kubenswrapper[4693]: I1204 10:19:24.979777 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:25 crc kubenswrapper[4693]: I1204 10:19:25.850584 4693 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:25 crc kubenswrapper[4693]: I1204 10:19:25.979077 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0" exitCode=0 Dec 04 10:19:25 crc kubenswrapper[4693]: I1204 10:19:25.979159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0"} Dec 04 10:19:25 crc kubenswrapper[4693]: I1204 10:19:25.979239 4693 scope.go:117] "RemoveContainer" containerID="b3b9dd090997b0edb81826dfc3ccf6a598b948c0ec4267af6a27a94beec68316" Dec 04 10:19:26 crc kubenswrapper[4693]: I1204 10:19:26.989949 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerStarted","Data":"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177"} Dec 04 10:19:26 crc kubenswrapper[4693]: I1204 10:19:26.994707 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b"} Dec 04 10:19:26 crc kubenswrapper[4693]: I1204 10:19:26.998893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" event={"ID":"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6","Type":"ContainerStarted","Data":"a73db57b4c47aeaf6cbc9e94f64dd99c09c4eeda7202cc745f3061f4ddffaaed"} Dec 04 10:19:26 crc kubenswrapper[4693]: I1204 10:19:26.999021 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kj5tj" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="registry-server" containerID="cri-o://1224eb80475c04852e4264fc71022aa18c789ebd70828ba40bb70b9d290987ea" gracePeriod=2 Dec 04 10:19:27 crc kubenswrapper[4693]: I1204 10:19:27.010393 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rgql" podStartSLOduration=4.257882081 podStartE2EDuration="11.010376372s" podCreationTimestamp="2025-12-04 10:19:16 +0000 UTC" firstStartedPulling="2025-12-04 10:19:18.868968865 +0000 UTC m=+2204.766562618" lastFinishedPulling="2025-12-04 10:19:25.621463136 +0000 UTC m=+2211.519056909" observedRunningTime="2025-12-04 10:19:27.006848481 +0000 UTC m=+2212.904442254" watchObservedRunningTime="2025-12-04 10:19:27.010376372 +0000 UTC m=+2212.907970125" Dec 04 10:19:27 crc kubenswrapper[4693]: I1204 10:19:27.055613 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" podStartSLOduration=3.113505025 podStartE2EDuration="5.055585482s" podCreationTimestamp="2025-12-04 10:19:22 +0000 UTC" firstStartedPulling="2025-12-04 10:19:23.877122315 +0000 UTC m=+2209.774716058" lastFinishedPulling="2025-12-04 10:19:25.819202762 +0000 UTC m=+2211.716796515" observedRunningTime="2025-12-04 10:19:27.043262066 +0000 UTC m=+2212.940855819" watchObservedRunningTime="2025-12-04 10:19:27.055585482 +0000 UTC m=+2212.953179235" Dec 04 10:19:27 crc kubenswrapper[4693]: I1204 
10:19:27.319503 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:27 crc kubenswrapper[4693]: I1204 10:19:27.319894 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:28 crc kubenswrapper[4693]: I1204 10:19:28.013455 4693 generic.go:334] "Generic (PLEG): container finished" podID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerID="1224eb80475c04852e4264fc71022aa18c789ebd70828ba40bb70b9d290987ea" exitCode=0 Dec 04 10:19:28 crc kubenswrapper[4693]: I1204 10:19:28.013509 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerDied","Data":"1224eb80475c04852e4264fc71022aa18c789ebd70828ba40bb70b9d290987ea"} Dec 04 10:19:28 crc kubenswrapper[4693]: I1204 10:19:28.380371 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8rgql" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="registry-server" probeResult="failure" output=< Dec 04 10:19:28 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 10:19:28 crc kubenswrapper[4693]: > Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.254230 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.325993 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content\") pod \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.326112 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6vf\" (UniqueName: \"kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf\") pod \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.326228 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities\") pod \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\" (UID: \"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a\") " Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.326931 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities" (OuterVolumeSpecName: "utilities") pod "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" (UID: "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.327181 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.339728 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf" (OuterVolumeSpecName: "kube-api-access-rm6vf") pod "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" (UID: "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a"). InnerVolumeSpecName "kube-api-access-rm6vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.382348 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" (UID: "5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.429620 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:29 crc kubenswrapper[4693]: I1204 10:19:29.429662 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6vf\" (UniqueName: \"kubernetes.io/projected/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a-kube-api-access-rm6vf\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.033663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kj5tj" event={"ID":"5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a","Type":"ContainerDied","Data":"7c5bfbf98a7f4b3c26e2302bb8bea2f2371d0c923fcad8ddf75897ba63897160"} Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.034005 4693 scope.go:117] "RemoveContainer" containerID="1224eb80475c04852e4264fc71022aa18c789ebd70828ba40bb70b9d290987ea" Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.033781 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kj5tj" Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.062248 4693 scope.go:117] "RemoveContainer" containerID="c2dbeb7075aa12666bf882f5f06805afe6376a522cf3066672503fe4575d05f2" Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.071923 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.085374 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kj5tj"] Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.088352 4693 scope.go:117] "RemoveContainer" containerID="ee93919cfd0793e592bfcde7c02afeb079dc64d6ddcf3795038e039cb5916809" Dec 04 10:19:30 crc kubenswrapper[4693]: I1204 10:19:30.475234 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" path="/var/lib/kubelet/pods/5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a/volumes" Dec 04 10:19:37 crc kubenswrapper[4693]: I1204 10:19:37.366466 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:37 crc kubenswrapper[4693]: I1204 10:19:37.416154 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:37 crc kubenswrapper[4693]: I1204 10:19:37.606690 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:39 crc kubenswrapper[4693]: I1204 10:19:39.120120 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rgql" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="registry-server" containerID="cri-o://5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177" gracePeriod=2 Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.097085 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.134114 4693 generic.go:334] "Generic (PLEG): container finished" podID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerID="5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177" exitCode=0 Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.134212 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerDied","Data":"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177"} Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.134245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rgql" event={"ID":"88af83d2-3de8-429b-9fd0-6dd81ed68def","Type":"ContainerDied","Data":"52496bd9ff255b80b6ec791cabeddcca727b3c899bab7b883a7c1663015d2c90"} Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.134282 4693 scope.go:117] "RemoveContainer" containerID="5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.134526 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rgql" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.159264 4693 scope.go:117] "RemoveContainer" containerID="98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.171520 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4xl5\" (UniqueName: \"kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5\") pod \"88af83d2-3de8-429b-9fd0-6dd81ed68def\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.171680 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities\") pod \"88af83d2-3de8-429b-9fd0-6dd81ed68def\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.171999 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content\") pod \"88af83d2-3de8-429b-9fd0-6dd81ed68def\" (UID: \"88af83d2-3de8-429b-9fd0-6dd81ed68def\") " Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.174080 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities" (OuterVolumeSpecName: "utilities") pod "88af83d2-3de8-429b-9fd0-6dd81ed68def" (UID: "88af83d2-3de8-429b-9fd0-6dd81ed68def"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.179284 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5" (OuterVolumeSpecName: "kube-api-access-f4xl5") pod "88af83d2-3de8-429b-9fd0-6dd81ed68def" (UID: "88af83d2-3de8-429b-9fd0-6dd81ed68def"). InnerVolumeSpecName "kube-api-access-f4xl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.183214 4693 scope.go:117] "RemoveContainer" containerID="499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.223568 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88af83d2-3de8-429b-9fd0-6dd81ed68def" (UID: "88af83d2-3de8-429b-9fd0-6dd81ed68def"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.272471 4693 scope.go:117] "RemoveContainer" containerID="5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177" Dec 04 10:19:40 crc kubenswrapper[4693]: E1204 10:19:40.273208 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177\": container with ID starting with 5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177 not found: ID does not exist" containerID="5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.273242 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177"} err="failed to get container status \"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177\": rpc error: code = NotFound desc = could not find container \"5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177\": container with ID starting with 5392fd31661273206361a9899c03acc9c961c543427fbf2618ea628e51038177 not found: ID does not exist" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.273280 4693 scope.go:117] "RemoveContainer" containerID="98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85" Dec 04 10:19:40 crc kubenswrapper[4693]: E1204 10:19:40.273657 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85\": container with ID starting with 98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85 not found: ID does not exist" containerID="98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.273702 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85"} err="failed to get container status \"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85\": rpc error: code = NotFound desc = could not find container \"98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85\": container with ID starting with 98839aee3c5fb933bd8e5e5a62225aaf73f7a7d306fad333dcefa5723406ae85 not found: ID does not exist" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.273717 4693 scope.go:117] "RemoveContainer" containerID="499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72" Dec 04 10:19:40 crc kubenswrapper[4693]: E1204 10:19:40.274189 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72\": container with ID starting with 499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72 not found: ID does not exist" containerID="499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.274247 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72"} err="failed to get container status \"499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72\": rpc error: code = NotFound desc = could not 
find container \"499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72\": container with ID starting with 499a3d81eff476ae50ed4c6552ad59faf0ca40a7140d674fee4d6ae2ff8a1a72 not found: ID does not exist" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.274612 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4xl5\" (UniqueName: \"kubernetes.io/projected/88af83d2-3de8-429b-9fd0-6dd81ed68def-kube-api-access-f4xl5\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.274650 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.274664 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88af83d2-3de8-429b-9fd0-6dd81ed68def-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.471663 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:40 crc kubenswrapper[4693]: I1204 10:19:40.476073 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rgql"] Dec 04 10:19:42 crc kubenswrapper[4693]: I1204 10:19:42.478079 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" path="/var/lib/kubelet/pods/88af83d2-3de8-429b-9fd0-6dd81ed68def/volumes" Dec 04 10:20:17 crc kubenswrapper[4693]: I1204 10:20:17.510245 4693 generic.go:334] "Generic (PLEG): container finished" podID="b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" containerID="a73db57b4c47aeaf6cbc9e94f64dd99c09c4eeda7202cc745f3061f4ddffaaed" exitCode=0 Dec 04 10:20:17 crc kubenswrapper[4693]: I1204 10:20:17.510432 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" event={"ID":"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6","Type":"ContainerDied","Data":"a73db57b4c47aeaf6cbc9e94f64dd99c09c4eeda7202cc745f3061f4ddffaaed"} Dec 04 10:20:18 crc kubenswrapper[4693]: I1204 10:20:18.929222 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.079574 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9lxj\" (UniqueName: \"kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj\") pod \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.079794 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key\") pod \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.079946 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory\") pod \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\" (UID: \"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6\") " Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.088438 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj" (OuterVolumeSpecName: "kube-api-access-h9lxj") pod "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" (UID: "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6"). InnerVolumeSpecName "kube-api-access-h9lxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.112697 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" (UID: "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.115816 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory" (OuterVolumeSpecName: "inventory") pod "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" (UID: "b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.182835 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9lxj\" (UniqueName: \"kubernetes.io/projected/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-kube-api-access-h9lxj\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.182871 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.182883 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.529301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" event={"ID":"b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6","Type":"ContainerDied","Data":"e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35"} Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.529620 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6374a213c3d08ba4dfe1036ef05124945cf58fcc51aae979ea655d368c35b35" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.529361 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2p64q" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620156 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tkkcn"] Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620562 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="extract-content" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620584 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="extract-content" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620597 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="extract-utilities" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620604 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="extract-utilities" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620617 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620624 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620652 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="extract-content" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620658 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="extract-content" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620670 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" 
containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620677 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620690 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="extract-utilities" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620696 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="extract-utilities" Dec 04 10:20:19 crc kubenswrapper[4693]: E1204 10:20:19.620713 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620720 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620965 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="88af83d2-3de8-429b-9fd0-6dd81ed68def" containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.620997 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0a4adc-412e-4fcf-8695-00b5a8bcdd6a" containerName="registry-server" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.621024 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.621715 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.716137 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.716289 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.716363 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxgmr\" (UniqueName: \"kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.717938 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.735974 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.735973 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.737951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.759902 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tkkcn"] Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.818030 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.818147 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxgmr\" (UniqueName: \"kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.818247 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc 
kubenswrapper[4693]: I1204 10:20:19.827719 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.835698 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:19 crc kubenswrapper[4693]: I1204 10:20:19.837404 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxgmr\" (UniqueName: \"kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr\") pod \"ssh-known-hosts-edpm-deployment-tkkcn\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:20 crc kubenswrapper[4693]: I1204 10:20:20.048525 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:20 crc kubenswrapper[4693]: I1204 10:20:20.540088 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tkkcn"] Dec 04 10:20:21 crc kubenswrapper[4693]: I1204 10:20:21.551026 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" event={"ID":"ff072e20-bb88-4bc8-8e07-a12912774161","Type":"ContainerStarted","Data":"23491993f20289c6e1dd66ac10fbf3e80b9c70fec2ab74ef6e501c248ede370c"} Dec 04 10:20:21 crc kubenswrapper[4693]: I1204 10:20:21.552041 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" event={"ID":"ff072e20-bb88-4bc8-8e07-a12912774161","Type":"ContainerStarted","Data":"90bcbcf2264e42897944bc9a46c10fb8856662bb9c3fe0eb49f9d3f6a9a9b72c"} Dec 04 10:20:21 crc kubenswrapper[4693]: I1204 10:20:21.582957 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" podStartSLOduration=2.179694344 podStartE2EDuration="2.582937426s" podCreationTimestamp="2025-12-04 10:20:19 +0000 UTC" firstStartedPulling="2025-12-04 10:20:20.550994084 +0000 UTC m=+2266.448587837" lastFinishedPulling="2025-12-04 10:20:20.954237166 +0000 UTC m=+2266.851830919" observedRunningTime="2025-12-04 10:20:21.573486153 +0000 UTC m=+2267.471079906" watchObservedRunningTime="2025-12-04 10:20:21.582937426 +0000 UTC m=+2267.480531179" Dec 04 10:20:28 crc kubenswrapper[4693]: I1204 10:20:28.608631 4693 generic.go:334] "Generic (PLEG): container finished" podID="ff072e20-bb88-4bc8-8e07-a12912774161" containerID="23491993f20289c6e1dd66ac10fbf3e80b9c70fec2ab74ef6e501c248ede370c" exitCode=0 Dec 04 10:20:28 crc kubenswrapper[4693]: I1204 10:20:28.608821 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" event={"ID":"ff072e20-bb88-4bc8-8e07-a12912774161","Type":"ContainerDied","Data":"23491993f20289c6e1dd66ac10fbf3e80b9c70fec2ab74ef6e501c248ede370c"} Dec 04 10:20:29 crc kubenswrapper[4693]: I1204 10:20:29.992697 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.150623 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam\") pod \"ff072e20-bb88-4bc8-8e07-a12912774161\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.150728 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0\") pod \"ff072e20-bb88-4bc8-8e07-a12912774161\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.151672 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxgmr\" (UniqueName: \"kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr\") pod \"ff072e20-bb88-4bc8-8e07-a12912774161\" (UID: \"ff072e20-bb88-4bc8-8e07-a12912774161\") " Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.156304 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr" (OuterVolumeSpecName: "kube-api-access-hxgmr") pod "ff072e20-bb88-4bc8-8e07-a12912774161" (UID: "ff072e20-bb88-4bc8-8e07-a12912774161"). InnerVolumeSpecName "kube-api-access-hxgmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.177370 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff072e20-bb88-4bc8-8e07-a12912774161" (UID: "ff072e20-bb88-4bc8-8e07-a12912774161"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.177890 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ff072e20-bb88-4bc8-8e07-a12912774161" (UID: "ff072e20-bb88-4bc8-8e07-a12912774161"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.254884 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxgmr\" (UniqueName: \"kubernetes.io/projected/ff072e20-bb88-4bc8-8e07-a12912774161-kube-api-access-hxgmr\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.254923 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.254936 4693 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ff072e20-bb88-4bc8-8e07-a12912774161-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.627631 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" event={"ID":"ff072e20-bb88-4bc8-8e07-a12912774161","Type":"ContainerDied","Data":"90bcbcf2264e42897944bc9a46c10fb8856662bb9c3fe0eb49f9d3f6a9a9b72c"} Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.627951 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90bcbcf2264e42897944bc9a46c10fb8856662bb9c3fe0eb49f9d3f6a9a9b72c" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.627716 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tkkcn" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.818778 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w"] Dec 04 10:20:30 crc kubenswrapper[4693]: E1204 10:20:30.821171 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff072e20-bb88-4bc8-8e07-a12912774161" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.821264 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff072e20-bb88-4bc8-8e07-a12912774161" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.821742 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff072e20-bb88-4bc8-8e07-a12912774161" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.822545 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.824358 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.824666 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.825631 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.825996 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.832675 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w"] Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.968181 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.968222 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c5k2\" (UniqueName: \"kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:30 crc kubenswrapper[4693]: I1204 10:20:30.968439 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.070907 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.071098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.071129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c5k2\" (UniqueName: \"kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.076069 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.081845 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.088977 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c5k2\" (UniqueName: \"kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wn5w\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.142352 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:31 crc kubenswrapper[4693]: I1204 10:20:31.636676 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w"] Dec 04 10:20:32 crc kubenswrapper[4693]: I1204 10:20:32.648705 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" event={"ID":"a57888c7-06f9-478c-9e80-3c028cabcb28","Type":"ContainerStarted","Data":"521b436633c642fb1b9c5172668f0100e583c24c426b3bd3e219fb0761edb142"} Dec 04 10:20:32 crc kubenswrapper[4693]: I1204 10:20:32.649048 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" event={"ID":"a57888c7-06f9-478c-9e80-3c028cabcb28","Type":"ContainerStarted","Data":"0954f59f32e500e696e3b053145eeb1f125ff7768e76cc2769ef4e5c360f681c"} Dec 04 10:20:32 crc kubenswrapper[4693]: I1204 10:20:32.678480 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" podStartSLOduration=2.190039436 podStartE2EDuration="2.678452395s" podCreationTimestamp="2025-12-04 10:20:30 +0000 UTC" firstStartedPulling="2025-12-04 10:20:31.645688681 +0000 UTC m=+2277.543282434" lastFinishedPulling="2025-12-04 10:20:32.13410163 +0000 UTC m=+2278.031695393" observedRunningTime="2025-12-04 10:20:32.663549532 +0000 UTC m=+2278.561143295" watchObservedRunningTime="2025-12-04 10:20:32.678452395 +0000 UTC m=+2278.576046148" Dec 04 10:20:40 crc kubenswrapper[4693]: I1204 10:20:40.717911 4693 generic.go:334] "Generic (PLEG): container finished" podID="a57888c7-06f9-478c-9e80-3c028cabcb28" containerID="521b436633c642fb1b9c5172668f0100e583c24c426b3bd3e219fb0761edb142" exitCode=0 Dec 04 10:20:40 crc kubenswrapper[4693]: I1204 10:20:40.718041 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" 
event={"ID":"a57888c7-06f9-478c-9e80-3c028cabcb28","Type":"ContainerDied","Data":"521b436633c642fb1b9c5172668f0100e583c24c426b3bd3e219fb0761edb142"} Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.193439 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.312673 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key\") pod \"a57888c7-06f9-478c-9e80-3c028cabcb28\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.312781 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c5k2\" (UniqueName: \"kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2\") pod \"a57888c7-06f9-478c-9e80-3c028cabcb28\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.312815 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory\") pod \"a57888c7-06f9-478c-9e80-3c028cabcb28\" (UID: \"a57888c7-06f9-478c-9e80-3c028cabcb28\") " Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.319286 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2" (OuterVolumeSpecName: "kube-api-access-7c5k2") pod "a57888c7-06f9-478c-9e80-3c028cabcb28" (UID: "a57888c7-06f9-478c-9e80-3c028cabcb28"). InnerVolumeSpecName "kube-api-access-7c5k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.346072 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a57888c7-06f9-478c-9e80-3c028cabcb28" (UID: "a57888c7-06f9-478c-9e80-3c028cabcb28"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.365873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory" (OuterVolumeSpecName: "inventory") pod "a57888c7-06f9-478c-9e80-3c028cabcb28" (UID: "a57888c7-06f9-478c-9e80-3c028cabcb28"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.414841 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.414884 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c5k2\" (UniqueName: \"kubernetes.io/projected/a57888c7-06f9-478c-9e80-3c028cabcb28-kube-api-access-7c5k2\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.414896 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a57888c7-06f9-478c-9e80-3c028cabcb28-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.735822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" event={"ID":"a57888c7-06f9-478c-9e80-3c028cabcb28","Type":"ContainerDied","Data":"0954f59f32e500e696e3b053145eeb1f125ff7768e76cc2769ef4e5c360f681c"} Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.735864 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0954f59f32e500e696e3b053145eeb1f125ff7768e76cc2769ef4e5c360f681c" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.736169 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wn5w" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.806121 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5"] Dec 04 10:20:42 crc kubenswrapper[4693]: E1204 10:20:42.806542 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57888c7-06f9-478c-9e80-3c028cabcb28" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.806562 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57888c7-06f9-478c-9e80-3c028cabcb28" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.806749 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57888c7-06f9-478c-9e80-3c028cabcb28" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.808159 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.810454 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.810640 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.810978 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.812110 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.819357 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5"] Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.925111 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.925577 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:42 crc kubenswrapper[4693]: I1204 10:20:42.925665 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg697\" (UniqueName: \"kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.027313 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.027417 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.027492 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg697\" (UniqueName: \"kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: 
\"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.032592 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.033934 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.044685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg697\" (UniqueName: \"kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.129477 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.651855 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5"] Dec 04 10:20:43 crc kubenswrapper[4693]: I1204 10:20:43.745497 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" event={"ID":"db089c88-39e5-4e8f-93b8-b02a59f50b93","Type":"ContainerStarted","Data":"1a93634dc988608e045459831c455dcfbcc97243fd5eea2444637c5fbfdd38b2"} Dec 04 10:20:46 crc kubenswrapper[4693]: I1204 10:20:46.770439 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" event={"ID":"db089c88-39e5-4e8f-93b8-b02a59f50b93","Type":"ContainerStarted","Data":"c2fe891dd68590ace2274548f426eb837eb78d3ad9a22164bb5d486f01e607d0"} Dec 04 10:20:46 crc kubenswrapper[4693]: I1204 10:20:46.788581 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" podStartSLOduration=2.242039618 podStartE2EDuration="4.788563602s" podCreationTimestamp="2025-12-04 10:20:42 +0000 UTC" firstStartedPulling="2025-12-04 10:20:43.652722319 +0000 UTC m=+2289.550316072" lastFinishedPulling="2025-12-04 10:20:46.199246303 +0000 UTC m=+2292.096840056" observedRunningTime="2025-12-04 10:20:46.78265909 +0000 UTC m=+2292.680252843" watchObservedRunningTime="2025-12-04 10:20:46.788563602 +0000 UTC m=+2292.686157355" Dec 04 10:20:55 crc kubenswrapper[4693]: I1204 10:20:55.842778 4693 generic.go:334] "Generic (PLEG): container finished" podID="db089c88-39e5-4e8f-93b8-b02a59f50b93" containerID="c2fe891dd68590ace2274548f426eb837eb78d3ad9a22164bb5d486f01e607d0" exitCode=0 Dec 04 10:20:55 crc kubenswrapper[4693]: I1204 10:20:55.842886 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" 
event={"ID":"db089c88-39e5-4e8f-93b8-b02a59f50b93","Type":"ContainerDied","Data":"c2fe891dd68590ace2274548f426eb837eb78d3ad9a22164bb5d486f01e607d0"} Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.278629 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.417268 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory\") pod \"db089c88-39e5-4e8f-93b8-b02a59f50b93\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.417667 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg697\" (UniqueName: \"kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697\") pod \"db089c88-39e5-4e8f-93b8-b02a59f50b93\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.417735 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key\") pod \"db089c88-39e5-4e8f-93b8-b02a59f50b93\" (UID: \"db089c88-39e5-4e8f-93b8-b02a59f50b93\") " Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.425705 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697" (OuterVolumeSpecName: "kube-api-access-gg697") pod "db089c88-39e5-4e8f-93b8-b02a59f50b93" (UID: "db089c88-39e5-4e8f-93b8-b02a59f50b93"). InnerVolumeSpecName "kube-api-access-gg697". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.468201 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory" (OuterVolumeSpecName: "inventory") pod "db089c88-39e5-4e8f-93b8-b02a59f50b93" (UID: "db089c88-39e5-4e8f-93b8-b02a59f50b93"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.471557 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db089c88-39e5-4e8f-93b8-b02a59f50b93" (UID: "db089c88-39e5-4e8f-93b8-b02a59f50b93"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.520684 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.520724 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg697\" (UniqueName: \"kubernetes.io/projected/db089c88-39e5-4e8f-93b8-b02a59f50b93-kube-api-access-gg697\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.520739 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db089c88-39e5-4e8f-93b8-b02a59f50b93-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.861451 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" event={"ID":"db089c88-39e5-4e8f-93b8-b02a59f50b93","Type":"ContainerDied","Data":"1a93634dc988608e045459831c455dcfbcc97243fd5eea2444637c5fbfdd38b2"} Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.861470 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.862018 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a93634dc988608e045459831c455dcfbcc97243fd5eea2444637c5fbfdd38b2" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.956444 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l"] Dec 04 10:20:57 crc kubenswrapper[4693]: E1204 10:20:57.956887 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db089c88-39e5-4e8f-93b8-b02a59f50b93" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.956906 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="db089c88-39e5-4e8f-93b8-b02a59f50b93" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.957104 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="db089c88-39e5-4e8f-93b8-b02a59f50b93" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.957929 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.960259 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.961045 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.961276 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.961432 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.961053 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.962068 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.962194 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.964642 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 04 10:20:57 crc kubenswrapper[4693]: I1204 10:20:57.975286 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l"] Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132046 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132223 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132299 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132479 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132533 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132566 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132620 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132691 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132731 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132853 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.132935 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.133002 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.133067 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.133169 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsh46\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.234919 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.234991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235032 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235081 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsh46\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235138 4693 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235165 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235182 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235205 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235223 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235264 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235286 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" 
Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235307 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.235375 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.240624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.240858 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.241332 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.241522 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.241826 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.242213 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.242385 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.242457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.244088 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.244626 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.244653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.250093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.250600 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc 
kubenswrapper[4693]: I1204 10:20:58.252909 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsh46\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.278299 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.786871 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l"] Dec 04 10:20:58 crc kubenswrapper[4693]: I1204 10:20:58.870384 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" event={"ID":"8ba7c573-448f-438c-9999-ffe4e8c28f52","Type":"ContainerStarted","Data":"704a369778d62495fa4614015f485ec4481bf171f6312f15b67d926632bd93b3"} Dec 04 10:20:59 crc kubenswrapper[4693]: I1204 10:20:59.879747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" event={"ID":"8ba7c573-448f-438c-9999-ffe4e8c28f52","Type":"ContainerStarted","Data":"ab35295230a2801ffb8d6656bf34e455ae035e321b7f0f64da44ff8084a784f8"} Dec 04 10:20:59 crc kubenswrapper[4693]: I1204 10:20:59.907926 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" podStartSLOduration=2.262976526 podStartE2EDuration="2.907906313s" podCreationTimestamp="2025-12-04 10:20:57 +0000 UTC" firstStartedPulling="2025-12-04 10:20:58.794543661 +0000 UTC m=+2304.692137414" lastFinishedPulling="2025-12-04 10:20:59.439473448 +0000 UTC m=+2305.337067201" observedRunningTime="2025-12-04 10:20:59.898272925 +0000 UTC m=+2305.795866698" watchObservedRunningTime="2025-12-04 10:20:59.907906313 +0000 UTC m=+2305.805500066" Dec 04 10:21:37 crc kubenswrapper[4693]: I1204 10:21:37.205955 4693 generic.go:334] "Generic (PLEG): container finished" podID="8ba7c573-448f-438c-9999-ffe4e8c28f52" containerID="ab35295230a2801ffb8d6656bf34e455ae035e321b7f0f64da44ff8084a784f8" exitCode=0 Dec 04 10:21:37 crc kubenswrapper[4693]: I1204 10:21:37.206054 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" event={"ID":"8ba7c573-448f-438c-9999-ffe4e8c28f52","Type":"ContainerDied","Data":"ab35295230a2801ffb8d6656bf34e455ae035e321b7f0f64da44ff8084a784f8"} Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.615475 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.751988 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752150 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752176 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752232 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752277 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752344 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.752377 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753537 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753578 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753616 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753665 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753769 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsh46\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.753798 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle\") pod \"8ba7c573-448f-438c-9999-ffe4e8c28f52\" (UID: \"8ba7c573-448f-438c-9999-ffe4e8c28f52\") " Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.758699 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.759129 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.759917 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.759938 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.763546 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.764646 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.771501 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.772445 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.772775 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.772858 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.773561 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.782844 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46" (OuterVolumeSpecName: "kube-api-access-fsh46") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "kube-api-access-fsh46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.791448 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.795860 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory" (OuterVolumeSpecName: "inventory") pod "8ba7c573-448f-438c-9999-ffe4e8c28f52" (UID: "8ba7c573-448f-438c-9999-ffe4e8c28f52"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856139 4693 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856188 4693 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856205 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856220 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856236 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856248 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856263 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856278 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsh46\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-kube-api-access-fsh46\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856290 4693 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856301 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856313 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856325 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8ba7c573-448f-438c-9999-ffe4e8c28f52-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856360 4693 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:38 crc kubenswrapper[4693]: I1204 10:21:38.856372 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba7c573-448f-438c-9999-ffe4e8c28f52-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.228940 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" event={"ID":"8ba7c573-448f-438c-9999-ffe4e8c28f52","Type":"ContainerDied","Data":"704a369778d62495fa4614015f485ec4481bf171f6312f15b67d926632bd93b3"} Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.229272 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704a369778d62495fa4614015f485ec4481bf171f6312f15b67d926632bd93b3" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.228995 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.405967 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh"] Dec 04 10:21:39 crc kubenswrapper[4693]: E1204 10:21:39.406435 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba7c573-448f-438c-9999-ffe4e8c28f52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.406458 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba7c573-448f-438c-9999-ffe4e8c28f52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.406684 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba7c573-448f-438c-9999-ffe4e8c28f52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.408214 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.412285 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.412431 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.412489 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.412541 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.413533 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.420176 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh"] Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.577004 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.577278 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.577573 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.577752 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj7kx\" (UniqueName: \"kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.577804 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.679602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj7kx\" 
(UniqueName: \"kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.679674 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.679722 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.679755 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.679831 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.680976 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.684028 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.684154 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.684639 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.697371 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj7kx\" (UniqueName: \"kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wkvzh\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:39 crc kubenswrapper[4693]: I1204 10:21:39.789722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:21:40 crc kubenswrapper[4693]: I1204 10:21:40.328778 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh"] Dec 04 10:21:41 crc kubenswrapper[4693]: I1204 10:21:41.249681 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" event={"ID":"ab7d9721-bbc8-489e-96de-98ce148725de","Type":"ContainerStarted","Data":"9ca0dfbf4eee4162151def548ed8464ec6462235215e819eabde956fe1ab6715"} Dec 04 10:21:41 crc kubenswrapper[4693]: I1204 10:21:41.250219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" event={"ID":"ab7d9721-bbc8-489e-96de-98ce148725de","Type":"ContainerStarted","Data":"e3f57361a6134ddbc1b0453500702c06d3dd5ebeddc17d64f9f0e8ec909c4f51"} Dec 04 10:21:52 crc kubenswrapper[4693]: I1204 10:21:52.273020 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:21:52 crc kubenswrapper[4693]: I1204 10:21:52.273669 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.490104 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" podStartSLOduration=23.080781343 podStartE2EDuration="23.49008562s" podCreationTimestamp="2025-12-04 10:21:39 +0000 UTC" firstStartedPulling="2025-12-04 10:21:40.333145546 +0000 UTC m=+2346.230739309" lastFinishedPulling="2025-12-04 10:21:40.742449833 +0000 UTC m=+2346.640043586" observedRunningTime="2025-12-04 10:21:41.274152513 +0000 UTC m=+2347.171746306" watchObservedRunningTime="2025-12-04 10:22:02.49008562 +0000 UTC m=+2368.387679373" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.497770 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.502740 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.510691 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.642942 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mjn\" (UniqueName: \"kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.643018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.643193 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.745456 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74mjn\" (UniqueName: \"kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.745525 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.745630 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.746160 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.746362 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.772196 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-74mjn\" (UniqueName: \"kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn\") pod \"redhat-marketplace-5jpfw\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:02 crc kubenswrapper[4693]: I1204 10:22:02.838633 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:03 crc kubenswrapper[4693]: I1204 10:22:03.305444 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:03 crc kubenswrapper[4693]: I1204 10:22:03.438056 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerStarted","Data":"8ca82613c4f7efc8f16ed91ff766b6e5ae8afa40378d2a8cbe4a07da7d569c11"} Dec 04 10:22:04 crc kubenswrapper[4693]: I1204 10:22:04.448020 4693 generic.go:334] "Generic (PLEG): container finished" podID="27bb2faf-03e6-42f6-8978-2205e7003346" containerID="83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022" exitCode=0 Dec 04 10:22:04 crc kubenswrapper[4693]: I1204 10:22:04.448123 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerDied","Data":"83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022"} Dec 04 10:22:04 crc kubenswrapper[4693]: I1204 10:22:04.450262 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:22:06 crc kubenswrapper[4693]: I1204 10:22:06.465532 4693 generic.go:334] "Generic (PLEG): container finished" podID="27bb2faf-03e6-42f6-8978-2205e7003346" containerID="6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0" exitCode=0 Dec 04 10:22:06 crc kubenswrapper[4693]: I1204 10:22:06.478484 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerDied","Data":"6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0"} Dec 04 10:22:07 crc kubenswrapper[4693]: I1204 10:22:07.478401 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerStarted","Data":"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67"} Dec 04 10:22:07 crc kubenswrapper[4693]: I1204 10:22:07.503973 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jpfw" podStartSLOduration=2.850377752 podStartE2EDuration="5.503953942s" podCreationTimestamp="2025-12-04 10:22:02 +0000 UTC" firstStartedPulling="2025-12-04 10:22:04.449832043 +0000 UTC m=+2370.347425796" lastFinishedPulling="2025-12-04 10:22:07.103408233 +0000 UTC m=+2373.001001986" observedRunningTime="2025-12-04 10:22:07.496009506 +0000 UTC m=+2373.393603259" watchObservedRunningTime="2025-12-04 10:22:07.503953942 +0000 UTC m=+2373.401547695" Dec 04 10:22:12 crc kubenswrapper[4693]: I1204 10:22:12.838780 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:12 crc kubenswrapper[4693]: I1204 10:22:12.840215 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:12 crc kubenswrapper[4693]: I1204 10:22:12.882172 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:13 crc kubenswrapper[4693]: I1204 10:22:13.575448 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:13 crc kubenswrapper[4693]: I1204 10:22:13.627130 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:15 crc kubenswrapper[4693]: I1204 10:22:15.550087 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jpfw" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="registry-server" containerID="cri-o://762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67" gracePeriod=2 Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.000636 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.138371 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities\") pod \"27bb2faf-03e6-42f6-8978-2205e7003346\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.138444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74mjn\" (UniqueName: \"kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn\") pod \"27bb2faf-03e6-42f6-8978-2205e7003346\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.138660 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content\") pod \"27bb2faf-03e6-42f6-8978-2205e7003346\" (UID: \"27bb2faf-03e6-42f6-8978-2205e7003346\") " Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.140193 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities" (OuterVolumeSpecName: "utilities") pod "27bb2faf-03e6-42f6-8978-2205e7003346" (UID: "27bb2faf-03e6-42f6-8978-2205e7003346"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.145869 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn" (OuterVolumeSpecName: "kube-api-access-74mjn") pod "27bb2faf-03e6-42f6-8978-2205e7003346" (UID: "27bb2faf-03e6-42f6-8978-2205e7003346"). InnerVolumeSpecName "kube-api-access-74mjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.160033 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27bb2faf-03e6-42f6-8978-2205e7003346" (UID: "27bb2faf-03e6-42f6-8978-2205e7003346"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.243745 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.244066 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bb2faf-03e6-42f6-8978-2205e7003346-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.244084 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74mjn\" (UniqueName: \"kubernetes.io/projected/27bb2faf-03e6-42f6-8978-2205e7003346-kube-api-access-74mjn\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.561424 4693 generic.go:334] "Generic (PLEG): container finished" podID="27bb2faf-03e6-42f6-8978-2205e7003346" containerID="762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67" exitCode=0 Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.561477 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerDied","Data":"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67"} Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.561513 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpfw" event={"ID":"27bb2faf-03e6-42f6-8978-2205e7003346","Type":"ContainerDied","Data":"8ca82613c4f7efc8f16ed91ff766b6e5ae8afa40378d2a8cbe4a07da7d569c11"} Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.561535 4693 scope.go:117] "RemoveContainer" containerID="762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.561548 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpfw" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.596492 4693 scope.go:117] "RemoveContainer" containerID="6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.598032 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.611817 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpfw"] Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.623076 4693 scope.go:117] "RemoveContainer" containerID="83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.663058 4693 scope.go:117] "RemoveContainer" containerID="762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67" Dec 04 10:22:16 crc kubenswrapper[4693]: E1204 10:22:16.663597 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67\": container with ID starting with 762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67 not found: ID does not exist" containerID="762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.663747 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67"} err="failed to get container status \"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67\": rpc error: code = NotFound desc = could not find container \"762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67\": container with ID starting with 762cb4153be1271522d4dc208b983eb8f93855b04d08c5cf3c53f9c8804aab67 not found: ID does not exist" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.663843 4693 scope.go:117] "RemoveContainer" containerID="6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0" Dec 04 10:22:16 crc kubenswrapper[4693]: E1204 10:22:16.664229 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0\": container with ID starting with 6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0 not found: ID does not exist" containerID="6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.664271 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0"} err="failed to get container status \"6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0\": rpc error: code = NotFound desc = could not find container \"6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0\": container with ID starting with 6aac49171de85c93c54b0e2349fb3cfc331087e9f090c1a8f9f8f81fcfc108c0 not found: ID does not exist" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.664295 4693 scope.go:117] "RemoveContainer" containerID="83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022" Dec 04 10:22:16 crc kubenswrapper[4693]: E1204 10:22:16.664579 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022\": container with ID starting with 83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022 not found: ID does not exist" containerID="83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022" Dec 04 10:22:16 crc kubenswrapper[4693]: I1204 10:22:16.664670 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022"} err="failed to get container status \"83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022\": rpc error: code = NotFound desc = could not find container \"83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022\": container with ID starting with 83d92d783baa81bac02c3f783f6a97761d4bff988073dd1abc065f4cd6bcf022 not found: ID does not exist" Dec 04 10:22:18 crc kubenswrapper[4693]: I1204 10:22:18.472655 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" path="/var/lib/kubelet/pods/27bb2faf-03e6-42f6-8978-2205e7003346/volumes" Dec 04 10:22:22 crc kubenswrapper[4693]: I1204 10:22:22.273500 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:22:22 crc kubenswrapper[4693]: I1204 10:22:22.274102 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:22:41 crc kubenswrapper[4693]: I1204 10:22:41.846324 4693 generic.go:334] "Generic (PLEG): container finished" podID="ab7d9721-bbc8-489e-96de-98ce148725de" containerID="9ca0dfbf4eee4162151def548ed8464ec6462235215e819eabde956fe1ab6715" exitCode=0 Dec 04 10:22:41 crc kubenswrapper[4693]: I1204 10:22:41.846363 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" event={"ID":"ab7d9721-bbc8-489e-96de-98ce148725de","Type":"ContainerDied","Data":"9ca0dfbf4eee4162151def548ed8464ec6462235215e819eabde956fe1ab6715"} Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.242308 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.405070 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0\") pod \"ab7d9721-bbc8-489e-96de-98ce148725de\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.405456 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory\") pod \"ab7d9721-bbc8-489e-96de-98ce148725de\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.405494 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle\") pod \"ab7d9721-bbc8-489e-96de-98ce148725de\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.405540 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj7kx\" (UniqueName: \"kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx\") pod \"ab7d9721-bbc8-489e-96de-98ce148725de\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.405687 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key\") pod \"ab7d9721-bbc8-489e-96de-98ce148725de\" (UID: \"ab7d9721-bbc8-489e-96de-98ce148725de\") " Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.411676 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ab7d9721-bbc8-489e-96de-98ce148725de" (UID: "ab7d9721-bbc8-489e-96de-98ce148725de"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.412728 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx" (OuterVolumeSpecName: "kube-api-access-xj7kx") pod "ab7d9721-bbc8-489e-96de-98ce148725de" (UID: "ab7d9721-bbc8-489e-96de-98ce148725de"). InnerVolumeSpecName "kube-api-access-xj7kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.433488 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ab7d9721-bbc8-489e-96de-98ce148725de" (UID: "ab7d9721-bbc8-489e-96de-98ce148725de"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.436721 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory" (OuterVolumeSpecName: "inventory") pod "ab7d9721-bbc8-489e-96de-98ce148725de" (UID: "ab7d9721-bbc8-489e-96de-98ce148725de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.437195 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ab7d9721-bbc8-489e-96de-98ce148725de" (UID: "ab7d9721-bbc8-489e-96de-98ce148725de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.508663 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj7kx\" (UniqueName: \"kubernetes.io/projected/ab7d9721-bbc8-489e-96de-98ce148725de-kube-api-access-xj7kx\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.508802 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.508863 4693 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ab7d9721-bbc8-489e-96de-98ce148725de-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.508882 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.508892 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7d9721-bbc8-489e-96de-98ce148725de-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.863272 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" event={"ID":"ab7d9721-bbc8-489e-96de-98ce148725de","Type":"ContainerDied","Data":"e3f57361a6134ddbc1b0453500702c06d3dd5ebeddc17d64f9f0e8ec909c4f51"} Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.863322 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f57361a6134ddbc1b0453500702c06d3dd5ebeddc17d64f9f0e8ec909c4f51" Dec 04 10:22:43 crc kubenswrapper[4693]: I1204 10:22:43.863345 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wkvzh" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.009677 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s"] Dec 04 10:22:44 crc kubenswrapper[4693]: E1204 10:22:44.010754 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="registry-server" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.010777 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="registry-server" Dec 04 10:22:44 crc kubenswrapper[4693]: E1204 10:22:44.010821 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="extract-utilities" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.010830 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="extract-utilities" Dec 04 10:22:44 crc kubenswrapper[4693]: E1204 10:22:44.010842 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7d9721-bbc8-489e-96de-98ce148725de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.010851 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7d9721-bbc8-489e-96de-98ce148725de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:44 crc kubenswrapper[4693]: E1204 10:22:44.010872 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="extract-content" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.010880 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="extract-content" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.011482 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bb2faf-03e6-42f6-8978-2205e7003346" containerName="registry-server" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.011530 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7d9721-bbc8-489e-96de-98ce148725de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.012886 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.022422 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.023164 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.023462 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.023588 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.026374 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.035171 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.046529 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s"] Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.122707 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.122780 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.122971 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.123041 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.123577 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99f67\" (UniqueName: 
\"kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.123705 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226009 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99f67\" (UniqueName: \"kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226094 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226215 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226279 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.226938 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.229969 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.230264 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.230427 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.230953 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.233994 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.242980 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99f67\" (UniqueName: \"kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.347284 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.831917 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s"] Dec 04 10:22:44 crc kubenswrapper[4693]: I1204 10:22:44.871718 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" event={"ID":"cbd12578-e1a3-41b0-95be-2162e189daae","Type":"ContainerStarted","Data":"9bdfc04a6adce82e72d3f441be744b4e06884927e1f52aaf75d3621dfe141b79"} Dec 04 10:22:45 crc kubenswrapper[4693]: I1204 10:22:45.885991 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" event={"ID":"cbd12578-e1a3-41b0-95be-2162e189daae","Type":"ContainerStarted","Data":"46e254990fb0897adfd389ff18d409debf45cf3f4813030d47aa943a1cafccfa"} Dec 04 10:22:45 crc kubenswrapper[4693]: I1204 10:22:45.905566 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" podStartSLOduration=2.258526101 podStartE2EDuration="2.905547537s" podCreationTimestamp="2025-12-04 10:22:43 +0000 UTC" firstStartedPulling="2025-12-04 10:22:44.839919016 +0000 UTC m=+2410.737512769" lastFinishedPulling="2025-12-04 10:22:45.486940452 +0000 UTC m=+2411.384534205" observedRunningTime="2025-12-04 10:22:45.904171291 +0000 UTC m=+2411.801765044" watchObservedRunningTime="2025-12-04 10:22:45.905547537 +0000 UTC m=+2411.803141290" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.272822 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.273390 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.273437 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.274150 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.274196 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" gracePeriod=600 Dec 04 10:22:52 crc kubenswrapper[4693]: E1204 10:22:52.390768 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.950380 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b"} Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.950485 4693 scope.go:117] "RemoveContainer" containerID="cb78c8a8470bb98259f762ceb5868f195a5cc40bc0fad1b334b0581114dcc9f0" Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.950298 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" exitCode=0 Dec 04 10:22:52 crc kubenswrapper[4693]: I1204 10:22:52.951479 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:22:52 crc kubenswrapper[4693]: E1204 10:22:52.952072 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:23:06 crc kubenswrapper[4693]: I1204 10:23:06.461422 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:23:06 crc kubenswrapper[4693]: E1204 10:23:06.462262 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.369938 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.372612 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.379747 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.503742 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfn2\" (UniqueName: \"kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.504293 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.504489 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.606684 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.606839 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.607081 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfn2\" (UniqueName: \"kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.607386 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.607385 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.640168 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hlfn2\" (UniqueName: \"kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2\") pod \"redhat-operators-64n4x\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:14 crc kubenswrapper[4693]: I1204 10:23:14.695563 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:15 crc kubenswrapper[4693]: I1204 10:23:15.208622 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:16 crc kubenswrapper[4693]: I1204 10:23:16.151907 4693 generic.go:334] "Generic (PLEG): container finished" podID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerID="31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63" exitCode=0 Dec 04 10:23:16 crc kubenswrapper[4693]: I1204 10:23:16.152122 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerDied","Data":"31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63"} Dec 04 10:23:16 crc kubenswrapper[4693]: I1204 10:23:16.152283 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerStarted","Data":"bb083d51b03b1363341cc4ce9eca91e04974ee9c681ddc058e27d8cd5a45bfac"} Dec 04 10:23:18 crc kubenswrapper[4693]: I1204 10:23:18.173213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerStarted","Data":"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35"} Dec 04 10:23:20 crc kubenswrapper[4693]: I1204 10:23:20.191808 4693 generic.go:334] "Generic (PLEG): container finished" podID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerID="3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35" exitCode=0 Dec 04 10:23:20 crc kubenswrapper[4693]: I1204 10:23:20.191887 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerDied","Data":"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35"} Dec 04 10:23:20 crc kubenswrapper[4693]: I1204 10:23:20.460865 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:23:20 crc kubenswrapper[4693]: E1204 10:23:20.461163 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:23:23 crc kubenswrapper[4693]: I1204 10:23:23.223508 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerStarted","Data":"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9"} Dec 04 10:23:23 crc kubenswrapper[4693]: I1204 10:23:23.248857 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-64n4x" podStartSLOduration=4.748875762 podStartE2EDuration="9.248835079s" podCreationTimestamp="2025-12-04 10:23:14 +0000 UTC" firstStartedPulling="2025-12-04 10:23:16.154684204 +0000 UTC m=+2442.052277957" lastFinishedPulling="2025-12-04 10:23:20.654643521 +0000 UTC m=+2446.552237274" observedRunningTime="2025-12-04 10:23:23.240774186 +0000 UTC m=+2449.138367939" watchObservedRunningTime="2025-12-04 10:23:23.248835079 +0000 UTC m=+2449.146428832" Dec 04 10:23:24 crc kubenswrapper[4693]: I1204 10:23:24.695853 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:24 crc kubenswrapper[4693]: I1204 10:23:24.695993 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:25 crc kubenswrapper[4693]: I1204 10:23:25.742175 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-64n4x" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="registry-server" probeResult="failure" output=< Dec 04 10:23:25 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 10:23:25 crc kubenswrapper[4693]: > Dec 04 10:23:33 crc kubenswrapper[4693]: I1204 10:23:33.309154 4693 generic.go:334] "Generic (PLEG): container finished" podID="cbd12578-e1a3-41b0-95be-2162e189daae" containerID="46e254990fb0897adfd389ff18d409debf45cf3f4813030d47aa943a1cafccfa" exitCode=0 Dec 04 10:23:33 crc kubenswrapper[4693]: I1204 10:23:33.309247 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" event={"ID":"cbd12578-e1a3-41b0-95be-2162e189daae","Type":"ContainerDied","Data":"46e254990fb0897adfd389ff18d409debf45cf3f4813030d47aa943a1cafccfa"} Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.469319 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:23:34 crc kubenswrapper[4693]: E1204 10:23:34.469903 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.728082 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.766411 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.814174 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.857649 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.857820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.857861 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.857926 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99f67\" (UniqueName: \"kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.857963 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.858073 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0\") pod \"cbd12578-e1a3-41b0-95be-2162e189daae\" (UID: \"cbd12578-e1a3-41b0-95be-2162e189daae\") " Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.865867 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67" (OuterVolumeSpecName: "kube-api-access-99f67") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "kube-api-access-99f67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.872505 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.899688 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.902759 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory" (OuterVolumeSpecName: "inventory") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.913664 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.928062 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbd12578-e1a3-41b0-95be-2162e189daae" (UID: "cbd12578-e1a3-41b0-95be-2162e189daae"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961097 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961147 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99f67\" (UniqueName: \"kubernetes.io/projected/cbd12578-e1a3-41b0-95be-2162e189daae-kube-api-access-99f67\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961162 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961174 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961185 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:34 crc kubenswrapper[4693]: I1204 10:23:34.961194 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbd12578-e1a3-41b0-95be-2162e189daae-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.010190 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.329069 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" event={"ID":"cbd12578-e1a3-41b0-95be-2162e189daae","Type":"ContainerDied","Data":"9bdfc04a6adce82e72d3f441be744b4e06884927e1f52aaf75d3621dfe141b79"} Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.329515 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bdfc04a6adce82e72d3f441be744b4e06884927e1f52aaf75d3621dfe141b79" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.329182 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.433712 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p"] Dec 04 10:23:35 crc kubenswrapper[4693]: E1204 10:23:35.434131 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd12578-e1a3-41b0-95be-2162e189daae" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.434149 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd12578-e1a3-41b0-95be-2162e189daae" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.434351 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd12578-e1a3-41b0-95be-2162e189daae" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.436827 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.439863 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.440037 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.440319 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.440484 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.440526 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.456892 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p"] Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.576160 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.577038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.577090 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.577136 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmxr\" (UniqueName: \"kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.577182 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.679614 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.679791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.679830 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.679874 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmxr\" (UniqueName: \"kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.679918 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.686663 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc 
kubenswrapper[4693]: I1204 10:23:35.687226 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.687992 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.701501 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.712853 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmxr\" (UniqueName: \"kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:35 crc kubenswrapper[4693]: I1204 10:23:35.764900 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:23:36 crc kubenswrapper[4693]: I1204 10:23:36.338557 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-64n4x" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="registry-server" containerID="cri-o://4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9" gracePeriod=2 Dec 04 10:23:36 crc kubenswrapper[4693]: I1204 10:23:36.374777 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p"] Dec 04 10:23:36 crc kubenswrapper[4693]: W1204 10:23:36.379458 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda943a73c_465d_4a30_be17_967c79007a91.slice/crio-f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f WatchSource:0}: Error finding container f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f: Status 404 returned error can't find the container with id f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f Dec 04 10:23:36 crc kubenswrapper[4693]: I1204 10:23:36.966055 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.111599 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content\") pod \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.111922 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfn2\" (UniqueName: \"kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2\") pod \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.112504 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities\") pod \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\" (UID: \"91dbd7f5-576a-42ec-aeb8-a7514f435d7f\") " Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.116124 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2" (OuterVolumeSpecName: "kube-api-access-hlfn2") pod "91dbd7f5-576a-42ec-aeb8-a7514f435d7f" (UID: "91dbd7f5-576a-42ec-aeb8-a7514f435d7f"). InnerVolumeSpecName "kube-api-access-hlfn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.118952 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities" (OuterVolumeSpecName: "utilities") pod "91dbd7f5-576a-42ec-aeb8-a7514f435d7f" (UID: "91dbd7f5-576a-42ec-aeb8-a7514f435d7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.209057 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91dbd7f5-576a-42ec-aeb8-a7514f435d7f" (UID: "91dbd7f5-576a-42ec-aeb8-a7514f435d7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.215042 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.215106 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.215122 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfn2\" (UniqueName: \"kubernetes.io/projected/91dbd7f5-576a-42ec-aeb8-a7514f435d7f-kube-api-access-hlfn2\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.347824 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" event={"ID":"a943a73c-465d-4a30-be17-967c79007a91","Type":"ContainerStarted","Data":"afe8de03e2463e613b1eaa6aaa85d4a4301338921473815e645a4e24000d3567"} Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.347876 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" event={"ID":"a943a73c-465d-4a30-be17-967c79007a91","Type":"ContainerStarted","Data":"f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f"} Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.350142 4693 generic.go:334] "Generic (PLEG): container finished" podID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerID="4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9" exitCode=0 Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.350174 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerDied","Data":"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9"} Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.350199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64n4x" event={"ID":"91dbd7f5-576a-42ec-aeb8-a7514f435d7f","Type":"ContainerDied","Data":"bb083d51b03b1363341cc4ce9eca91e04974ee9c681ddc058e27d8cd5a45bfac"} Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.350216 4693 scope.go:117] "RemoveContainer" containerID="4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.350245 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64n4x" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.368179 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" podStartSLOduration=1.91315017 podStartE2EDuration="2.368157266s" podCreationTimestamp="2025-12-04 10:23:35 +0000 UTC" firstStartedPulling="2025-12-04 10:23:36.382968009 +0000 UTC m=+2462.280561762" lastFinishedPulling="2025-12-04 10:23:36.837975105 +0000 UTC m=+2462.735568858" observedRunningTime="2025-12-04 10:23:37.366580664 +0000 UTC m=+2463.264174437" watchObservedRunningTime="2025-12-04 10:23:37.368157266 +0000 UTC m=+2463.265751019" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.386878 4693 scope.go:117] "RemoveContainer" containerID="3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.390169 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.400321 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-64n4x"] Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.406456 4693 scope.go:117] "RemoveContainer" containerID="31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.427116 4693 scope.go:117] "RemoveContainer" containerID="4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9" Dec 04 10:23:37 crc kubenswrapper[4693]: E1204 10:23:37.427614 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9\": container with ID starting with 4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9 not found: ID does not exist" containerID="4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.427660 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9"} err="failed to get container status \"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9\": rpc error: code = NotFound desc = could not find container \"4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9\": container with ID starting with 4e0d368351024f0468a7ebd7765f02957c69709da36f48a784aa1d781edd20d9 not found: ID does not exist" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.427688 4693 scope.go:117] "RemoveContainer" containerID="3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35" Dec 04 10:23:37 crc kubenswrapper[4693]: E1204 10:23:37.428252 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35\": container with ID starting with 3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35 not found: ID does not exist" containerID="3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.428284 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35"} err="failed to get 
container status \"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35\": rpc error: code = NotFound desc = could not find container \"3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35\": container with ID starting with 3f3533bb37b27fef4c2ff466575ac45978a02889d83cce6d000bcc9047792f35 not found: ID does not exist" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.428310 4693 scope.go:117] "RemoveContainer" containerID="31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63" Dec 04 10:23:37 crc kubenswrapper[4693]: E1204 10:23:37.429035 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63\": container with ID starting with 31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63 not found: ID does not exist" containerID="31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63" Dec 04 10:23:37 crc kubenswrapper[4693]: I1204 10:23:37.429066 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63"} err="failed to get container status \"31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63\": rpc error: code = NotFound desc = could not find container \"31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63\": container with ID starting with 31e4f136b3414b67532b9061e1dc8894c3d90bb7d84f601821d1369f51152f63 not found: ID does not exist" Dec 04 10:23:38 crc kubenswrapper[4693]: I1204 10:23:38.472169 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" path="/var/lib/kubelet/pods/91dbd7f5-576a-42ec-aeb8-a7514f435d7f/volumes" Dec 04 10:23:45 crc kubenswrapper[4693]: I1204 10:23:45.462216 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:23:45 crc kubenswrapper[4693]: E1204 10:23:45.463537 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:23:57 crc kubenswrapper[4693]: I1204 10:23:57.461962 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:23:57 crc kubenswrapper[4693]: E1204 10:23:57.462720 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:24:08 crc kubenswrapper[4693]: I1204 10:24:08.461620 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:24:08 crc kubenswrapper[4693]: E1204 10:24:08.464295 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:24:23 crc kubenswrapper[4693]: I1204 10:24:23.461645 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:24:23 crc kubenswrapper[4693]: E1204 10:24:23.462370 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:24:35 crc kubenswrapper[4693]: I1204 10:24:35.461438 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:24:35 crc kubenswrapper[4693]: E1204 10:24:35.462420 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:24:50 crc kubenswrapper[4693]: I1204 10:24:50.461768 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:24:50 crc kubenswrapper[4693]: E1204 10:24:50.462594 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:25:02 crc kubenswrapper[4693]: I1204 10:25:02.461685 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:25:02 crc kubenswrapper[4693]: E1204 10:25:02.462690 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:25:15 crc kubenswrapper[4693]: I1204 10:25:15.461131 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:25:15 crc kubenswrapper[4693]: E1204 10:25:15.461876 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:25:26 crc kubenswrapper[4693]: I1204 10:25:26.461348 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:25:26 crc kubenswrapper[4693]: E1204 10:25:26.462629 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:25:40 crc kubenswrapper[4693]: I1204 10:25:40.462268 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:25:40 crc kubenswrapper[4693]: E1204 10:25:40.463521 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:25:54 crc kubenswrapper[4693]: I1204 10:25:54.468008 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:25:54 crc kubenswrapper[4693]: E1204 10:25:54.468886 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:26:10 crc kubenswrapper[4693]: I1204 10:26:10.461997 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:26:10 crc kubenswrapper[4693]: E1204 10:26:10.462787 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:26:21 crc kubenswrapper[4693]: I1204 10:26:21.461672 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:26:21 crc kubenswrapper[4693]: E1204 10:26:21.462491 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:26:35 crc kubenswrapper[4693]: I1204 10:26:35.461166 4693 
scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:26:35 crc kubenswrapper[4693]: E1204 10:26:35.462280 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:26:47 crc kubenswrapper[4693]: I1204 10:26:47.461981 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:26:47 crc kubenswrapper[4693]: E1204 10:26:47.462852 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:27:01 crc kubenswrapper[4693]: I1204 10:27:01.461207 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:27:01 crc kubenswrapper[4693]: E1204 10:27:01.464264 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:27:13 crc kubenswrapper[4693]: I1204 10:27:13.460939 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:27:13 crc kubenswrapper[4693]: E1204 10:27:13.461779 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:27:28 crc kubenswrapper[4693]: I1204 10:27:28.460893 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:27:28 crc kubenswrapper[4693]: E1204 10:27:28.461786 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:27:42 crc kubenswrapper[4693]: I1204 10:27:42.462602 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:27:42 crc kubenswrapper[4693]: E1204 10:27:42.463388 4693 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:27:55 crc kubenswrapper[4693]: I1204 10:27:55.461944 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:27:56 crc kubenswrapper[4693]: I1204 10:27:56.568354 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9"} Dec 04 10:28:03 crc kubenswrapper[4693]: I1204 10:28:03.626924 4693 generic.go:334] "Generic (PLEG): container finished" podID="a943a73c-465d-4a30-be17-967c79007a91" containerID="afe8de03e2463e613b1eaa6aaa85d4a4301338921473815e645a4e24000d3567" exitCode=0 Dec 04 10:28:03 crc kubenswrapper[4693]: I1204 10:28:03.627086 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" event={"ID":"a943a73c-465d-4a30-be17-967c79007a91","Type":"ContainerDied","Data":"afe8de03e2463e613b1eaa6aaa85d4a4301338921473815e645a4e24000d3567"} Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.096168 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.196247 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0\") pod \"a943a73c-465d-4a30-be17-967c79007a91\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.196434 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory\") pod \"a943a73c-465d-4a30-be17-967c79007a91\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.196475 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key\") pod \"a943a73c-465d-4a30-be17-967c79007a91\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.196592 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle\") pod \"a943a73c-465d-4a30-be17-967c79007a91\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.196644 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znmxr\" (UniqueName: \"kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr\") pod \"a943a73c-465d-4a30-be17-967c79007a91\" (UID: \"a943a73c-465d-4a30-be17-967c79007a91\") " Dec 04 10:28:05 crc 
kubenswrapper[4693]: I1204 10:28:05.202562 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a943a73c-465d-4a30-be17-967c79007a91" (UID: "a943a73c-465d-4a30-be17-967c79007a91"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.203427 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr" (OuterVolumeSpecName: "kube-api-access-znmxr") pod "a943a73c-465d-4a30-be17-967c79007a91" (UID: "a943a73c-465d-4a30-be17-967c79007a91"). InnerVolumeSpecName "kube-api-access-znmxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.223936 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory" (OuterVolumeSpecName: "inventory") pod "a943a73c-465d-4a30-be17-967c79007a91" (UID: "a943a73c-465d-4a30-be17-967c79007a91"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.225729 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a943a73c-465d-4a30-be17-967c79007a91" (UID: "a943a73c-465d-4a30-be17-967c79007a91"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.227509 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a943a73c-465d-4a30-be17-967c79007a91" (UID: "a943a73c-465d-4a30-be17-967c79007a91"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.299432 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.299464 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.299474 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.299486 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a943a73c-465d-4a30-be17-967c79007a91-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.299497 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znmxr\" (UniqueName: \"kubernetes.io/projected/a943a73c-465d-4a30-be17-967c79007a91-kube-api-access-znmxr\") on node \"crc\" DevicePath \"\"" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.648767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" event={"ID":"a943a73c-465d-4a30-be17-967c79007a91","Type":"ContainerDied","Data":"f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f"} Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.648815 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01ddb0f3b898ce894dd6930fd02204f79ee318c97d356ef14e8861cdf8c475f" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.649112 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.763716 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6"] Dec 04 10:28:05 crc kubenswrapper[4693]: E1204 10:28:05.764235 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="registry-server" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764258 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="registry-server" Dec 04 10:28:05 crc kubenswrapper[4693]: E1204 10:28:05.764296 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a943a73c-465d-4a30-be17-967c79007a91" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764306 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a943a73c-465d-4a30-be17-967c79007a91" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:28:05 crc kubenswrapper[4693]: E1204 10:28:05.764325 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="extract-utilities" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764356 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="extract-utilities" Dec 04 10:28:05 crc kubenswrapper[4693]: E1204 10:28:05.764375 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="extract-content" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764384 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="extract-content" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764685 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a943a73c-465d-4a30-be17-967c79007a91" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.764708 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="91dbd7f5-576a-42ec-aeb8-a7514f435d7f" containerName="registry-server" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.765442 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.768321 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.768842 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.768954 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.769112 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.769227 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.772182 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.774553 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.786699 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6"] Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.910454 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.910918 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48vp\" (UniqueName: \"kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.910970 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.910987 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.911018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.911107 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.911170 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.911390 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:05 crc kubenswrapper[4693]: I1204 10:28:05.911465 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013267 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n48vp\" (UniqueName: \"kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013353 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013374 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013431 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013457 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013501 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013526 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.013544 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.014369 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.018145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.019508 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.019874 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.019917 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.020732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.027513 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.029427 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.030440 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48vp\" (UniqueName: \"kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jj8z6\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.088536 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.587811 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6"] Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.593452 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:28:06 crc kubenswrapper[4693]: I1204 10:28:06.658979 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" event={"ID":"50c8246f-670e-4056-9c35-19e8042a96bf","Type":"ContainerStarted","Data":"dd134a293dfe3fbd965e61aee3145cec7a50dcad9fe3d0bb3cf23ff36cd80385"} Dec 04 10:28:07 crc kubenswrapper[4693]: I1204 10:28:07.673078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" event={"ID":"50c8246f-670e-4056-9c35-19e8042a96bf","Type":"ContainerStarted","Data":"d4b3023e8e0629e58e1c821b951298f1d64dc8181eaff669a41f670efd10f507"} Dec 04 10:28:07 crc kubenswrapper[4693]: I1204 10:28:07.702869 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" podStartSLOduration=2.167653316 podStartE2EDuration="2.702841806s" podCreationTimestamp="2025-12-04 10:28:05 +0000 UTC" firstStartedPulling="2025-12-04 10:28:06.593202743 +0000 UTC m=+2732.490796496" lastFinishedPulling="2025-12-04 10:28:07.128391233 +0000 UTC m=+2733.025984986" observedRunningTime="2025-12-04 10:28:07.693754402 +0000 UTC m=+2733.591348165" watchObservedRunningTime="2025-12-04 10:28:07.702841806 +0000 UTC m=+2733.600435579" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.722993 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.726173 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.748224 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.850554 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.850673 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tjbg\" (UniqueName: \"kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.850719 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.952111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.952229 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tjbg\" (UniqueName: \"kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.952279 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.952540 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.952588 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:15 crc kubenswrapper[4693]: I1204 10:29:15.984228 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7tjbg\" (UniqueName: \"kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg\") pod \"community-operators-5tmbf\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:16 crc kubenswrapper[4693]: I1204 10:29:16.056645 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:16 crc kubenswrapper[4693]: I1204 10:29:16.395439 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:17 crc kubenswrapper[4693]: I1204 10:29:17.250502 4693 generic.go:334] "Generic (PLEG): container finished" podID="67efc455-46b6-4224-a4e0-44907d0149e2" containerID="541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c" exitCode=0 Dec 04 10:29:17 crc kubenswrapper[4693]: I1204 10:29:17.250609 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerDied","Data":"541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c"} Dec 04 10:29:17 crc kubenswrapper[4693]: I1204 10:29:17.251006 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerStarted","Data":"0972c65fdf29ac4cae03f09d8cd80292b39c872d19bdac3cbf3d38c238af1a2d"} Dec 04 10:29:18 crc kubenswrapper[4693]: I1204 10:29:18.287314 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerStarted","Data":"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5"} Dec 04 10:29:19 crc kubenswrapper[4693]: I1204 10:29:19.300400 4693 generic.go:334] "Generic (PLEG): container finished" podID="67efc455-46b6-4224-a4e0-44907d0149e2" containerID="03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5" exitCode=0 Dec 04 10:29:19 crc kubenswrapper[4693]: I1204 10:29:19.300500 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerDied","Data":"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5"} Dec 04 10:29:20 crc kubenswrapper[4693]: I1204 10:29:20.312929 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerStarted","Data":"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f"} Dec 04 10:29:20 crc kubenswrapper[4693]: I1204 10:29:20.329594 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tmbf" podStartSLOduration=2.6154139880000002 podStartE2EDuration="5.329560825s" podCreationTimestamp="2025-12-04 10:29:15 +0000 UTC" firstStartedPulling="2025-12-04 10:29:17.253508373 +0000 UTC m=+2803.151102126" lastFinishedPulling="2025-12-04 10:29:19.96765521 +0000 UTC m=+2805.865248963" observedRunningTime="2025-12-04 10:29:20.327073338 +0000 UTC m=+2806.224667091" watchObservedRunningTime="2025-12-04 10:29:20.329560825 +0000 UTC m=+2806.227154578" Dec 04 10:29:26 crc kubenswrapper[4693]: I1204 10:29:26.057074 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:26 crc kubenswrapper[4693]: I1204 10:29:26.057671 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:26 crc kubenswrapper[4693]: I1204 10:29:26.152955 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:26 crc kubenswrapper[4693]: I1204 10:29:26.413204 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:26 crc kubenswrapper[4693]: I1204 10:29:26.459785 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:28 crc kubenswrapper[4693]: I1204 10:29:28.385995 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tmbf" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="registry-server" containerID="cri-o://429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f" gracePeriod=2 Dec 04 10:29:28 crc kubenswrapper[4693]: I1204 10:29:28.875933 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.029498 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tjbg\" (UniqueName: \"kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg\") pod \"67efc455-46b6-4224-a4e0-44907d0149e2\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.029612 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities\") pod \"67efc455-46b6-4224-a4e0-44907d0149e2\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.029765 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content\") pod \"67efc455-46b6-4224-a4e0-44907d0149e2\" (UID: \"67efc455-46b6-4224-a4e0-44907d0149e2\") " Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.030923 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities" (OuterVolumeSpecName: "utilities") pod "67efc455-46b6-4224-a4e0-44907d0149e2" (UID: "67efc455-46b6-4224-a4e0-44907d0149e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.035534 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg" (OuterVolumeSpecName: "kube-api-access-7tjbg") pod "67efc455-46b6-4224-a4e0-44907d0149e2" (UID: "67efc455-46b6-4224-a4e0-44907d0149e2"). InnerVolumeSpecName "kube-api-access-7tjbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.083667 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67efc455-46b6-4224-a4e0-44907d0149e2" (UID: "67efc455-46b6-4224-a4e0-44907d0149e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.131861 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tjbg\" (UniqueName: \"kubernetes.io/projected/67efc455-46b6-4224-a4e0-44907d0149e2-kube-api-access-7tjbg\") on node \"crc\" DevicePath \"\"" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.131899 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.131910 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67efc455-46b6-4224-a4e0-44907d0149e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.395840 4693 generic.go:334] "Generic (PLEG): container finished" podID="67efc455-46b6-4224-a4e0-44907d0149e2" containerID="429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f" exitCode=0 Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.395906 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tmbf" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.395886 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerDied","Data":"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f"} Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.396038 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tmbf" event={"ID":"67efc455-46b6-4224-a4e0-44907d0149e2","Type":"ContainerDied","Data":"0972c65fdf29ac4cae03f09d8cd80292b39c872d19bdac3cbf3d38c238af1a2d"} Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.396062 4693 scope.go:117] "RemoveContainer" containerID="429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.416727 4693 scope.go:117] "RemoveContainer" containerID="03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.433628 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.442401 4693 scope.go:117] "RemoveContainer" containerID="541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.442561 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tmbf"] Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.492529 4693 scope.go:117] "RemoveContainer" containerID="429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f" Dec 04 10:29:29 crc kubenswrapper[4693]: E1204 10:29:29.493591 4693 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f\": container with ID starting with 429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f not found: ID does not exist" containerID="429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.493647 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f"} err="failed to get container status \"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f\": rpc error: code = NotFound desc = could not find container \"429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f\": container with ID starting with 429120ad8e8d74a8b4abfcda44307fa51a1bc05812b2e36b9a5a9f97cdb85a8f not found: ID does not exist" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.493675 4693 scope.go:117] "RemoveContainer" containerID="03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5" Dec 04 10:29:29 crc kubenswrapper[4693]: E1204 10:29:29.494045 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5\": container with ID starting with 03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5 not found: ID does not exist" containerID="03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.494089 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5"} err="failed to get container status \"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5\": rpc error: code = NotFound desc = could not find container \"03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5\": container with ID starting with 03384140fdcb5d7415cba35956f71dc6da2db5c222895a9e69861a421b8800f5 not found: ID does not exist" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.494119 4693 scope.go:117] "RemoveContainer" containerID="541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c" Dec 04 10:29:29 crc kubenswrapper[4693]: E1204 10:29:29.494322 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c\": container with ID starting with 541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c not found: ID does not exist" containerID="541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c" Dec 04 10:29:29 crc kubenswrapper[4693]: I1204 10:29:29.494363 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c"} err="failed to get container status \"541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c\": rpc error: code = NotFound desc = could not find container \"541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c\": container with ID starting with 541994dda0dd273c751aa6b8c80a631293248359d504d9e286fd781cbe92320c not found: ID does not exist" Dec 04 10:29:30 crc kubenswrapper[4693]: I1204 10:29:30.472441 4693 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" path="/var/lib/kubelet/pods/67efc455-46b6-4224-a4e0-44907d0149e2/volumes" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.162172 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx"] Dec 04 10:30:00 crc kubenswrapper[4693]: E1204 10:30:00.163258 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.163277 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4693]: E1204 10:30:00.163289 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="extract-utilities" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.163298 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="extract-utilities" Dec 04 10:30:00 crc kubenswrapper[4693]: E1204 10:30:00.163350 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.163359 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.163623 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="67efc455-46b6-4224-a4e0-44907d0149e2" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.164323 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.167153 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.167524 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.183357 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx"] Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.284177 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.284473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ptf\" (UniqueName: \"kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.284638 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.386241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.386316 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ptf\" (UniqueName: \"kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.386396 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.387258 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume\") pod 
\"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.399277 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.402624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ptf\" (UniqueName: \"kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf\") pod \"collect-profiles-29414070-b79nx\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.489311 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:00 crc kubenswrapper[4693]: I1204 10:30:00.981655 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx"] Dec 04 10:30:01 crc kubenswrapper[4693]: I1204 10:30:01.324156 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" event={"ID":"8658c582-4e18-4b63-ae78-81da959895b5","Type":"ContainerStarted","Data":"56b13829980dacea53c54829f0b24b8652b4da321c853ca0c347efbf0b3e7ee6"} Dec 04 10:30:01 crc kubenswrapper[4693]: I1204 10:30:01.324514 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" event={"ID":"8658c582-4e18-4b63-ae78-81da959895b5","Type":"ContainerStarted","Data":"607b7f68ae6ad78c400507ddd994b4c069f954dcac2eddd844f843daeefc8b58"} Dec 04 10:30:01 crc kubenswrapper[4693]: I1204 10:30:01.353958 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" podStartSLOduration=1.353941003 podStartE2EDuration="1.353941003s" podCreationTimestamp="2025-12-04 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:30:01.350900172 +0000 UTC m=+2847.248493945" watchObservedRunningTime="2025-12-04 10:30:01.353941003 +0000 UTC m=+2847.251534746" Dec 04 10:30:02 crc kubenswrapper[4693]: I1204 10:30:02.338127 4693 generic.go:334] "Generic (PLEG): container finished" podID="8658c582-4e18-4b63-ae78-81da959895b5" containerID="56b13829980dacea53c54829f0b24b8652b4da321c853ca0c347efbf0b3e7ee6" exitCode=0 Dec 04 10:30:02 crc kubenswrapper[4693]: I1204 10:30:02.338424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" event={"ID":"8658c582-4e18-4b63-ae78-81da959895b5","Type":"ContainerDied","Data":"56b13829980dacea53c54829f0b24b8652b4da321c853ca0c347efbf0b3e7ee6"} Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.678906 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.754993 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ptf\" (UniqueName: \"kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf\") pod \"8658c582-4e18-4b63-ae78-81da959895b5\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.755116 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume\") pod \"8658c582-4e18-4b63-ae78-81da959895b5\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.755308 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume\") pod \"8658c582-4e18-4b63-ae78-81da959895b5\" (UID: \"8658c582-4e18-4b63-ae78-81da959895b5\") " Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.756314 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "8658c582-4e18-4b63-ae78-81da959895b5" (UID: "8658c582-4e18-4b63-ae78-81da959895b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.761628 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8658c582-4e18-4b63-ae78-81da959895b5" (UID: "8658c582-4e18-4b63-ae78-81da959895b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.761640 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf" (OuterVolumeSpecName: "kube-api-access-28ptf") pod "8658c582-4e18-4b63-ae78-81da959895b5" (UID: "8658c582-4e18-4b63-ae78-81da959895b5"). InnerVolumeSpecName "kube-api-access-28ptf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.857824 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ptf\" (UniqueName: \"kubernetes.io/projected/8658c582-4e18-4b63-ae78-81da959895b5-kube-api-access-28ptf\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.857862 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8658c582-4e18-4b63-ae78-81da959895b5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:03 crc kubenswrapper[4693]: I1204 10:30:03.857876 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8658c582-4e18-4b63-ae78-81da959895b5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.358519 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" event={"ID":"8658c582-4e18-4b63-ae78-81da959895b5","Type":"ContainerDied","Data":"607b7f68ae6ad78c400507ddd994b4c069f954dcac2eddd844f843daeefc8b58"} Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.358866 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="607b7f68ae6ad78c400507ddd994b4c069f954dcac2eddd844f843daeefc8b58" Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.358584 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx" Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.429677 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2"] Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.438899 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-qsdk2"] Dec 04 10:30:04 crc kubenswrapper[4693]: I1204 10:30:04.478149 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225b0ea4-5e69-43d9-92b8-e54ff9cef03b" path="/var/lib/kubelet/pods/225b0ea4-5e69-43d9-92b8-e54ff9cef03b/volumes" Dec 04 10:30:10 crc kubenswrapper[4693]: I1204 10:30:10.666810 4693 scope.go:117] "RemoveContainer" containerID="08cabf8ca227bd33145df394354f51c5447c3d07671fbcc8c18c505bcd6da123" Dec 04 10:30:22 crc kubenswrapper[4693]: I1204 10:30:22.272955 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:30:22 crc kubenswrapper[4693]: I1204 10:30:22.273729 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:30:52 crc kubenswrapper[4693]: I1204 10:30:52.272673 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 04 10:30:52 crc kubenswrapper[4693]: I1204 10:30:52.273181 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:30:52 crc kubenswrapper[4693]: I1204 10:30:52.768687 4693 generic.go:334] "Generic (PLEG): container finished" podID="50c8246f-670e-4056-9c35-19e8042a96bf" containerID="d4b3023e8e0629e58e1c821b951298f1d64dc8181eaff669a41f670efd10f507" exitCode=0 Dec 04 10:30:52 crc kubenswrapper[4693]: I1204 10:30:52.768790 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" event={"ID":"50c8246f-670e-4056-9c35-19e8042a96bf","Type":"ContainerDied","Data":"d4b3023e8e0629e58e1c821b951298f1d64dc8181eaff669a41f670efd10f507"} Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.200676 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374400 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374666 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374708 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374762 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n48vp\" (UniqueName: \"kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374867 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374942 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.374995 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.375112 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.375148 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0\") pod \"50c8246f-670e-4056-9c35-19e8042a96bf\" (UID: \"50c8246f-670e-4056-9c35-19e8042a96bf\") " Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.387615 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp" (OuterVolumeSpecName: "kube-api-access-n48vp") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "kube-api-access-n48vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.403858 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.412140 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.413009 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory" (OuterVolumeSpecName: "inventory") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.413427 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.415257 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.415581 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.423027 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.423484 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "50c8246f-670e-4056-9c35-19e8042a96bf" (UID: "50c8246f-670e-4056-9c35-19e8042a96bf"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479190 4693 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50c8246f-670e-4056-9c35-19e8042a96bf-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479441 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n48vp\" (UniqueName: \"kubernetes.io/projected/50c8246f-670e-4056-9c35-19e8042a96bf-kube-api-access-n48vp\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479551 4693 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479645 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479781 4693 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479845 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479906 4693 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.479964 4693 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.480027 4693 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50c8246f-670e-4056-9c35-19e8042a96bf-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.788684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" event={"ID":"50c8246f-670e-4056-9c35-19e8042a96bf","Type":"ContainerDied","Data":"dd134a293dfe3fbd965e61aee3145cec7a50dcad9fe3d0bb3cf23ff36cd80385"} Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.788993 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd134a293dfe3fbd965e61aee3145cec7a50dcad9fe3d0bb3cf23ff36cd80385" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.788805 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jj8z6" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.888708 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5"] Dec 04 10:30:54 crc kubenswrapper[4693]: E1204 10:30:54.889143 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c8246f-670e-4056-9c35-19e8042a96bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.889161 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c8246f-670e-4056-9c35-19e8042a96bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:54 crc kubenswrapper[4693]: E1204 10:30:54.889182 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8658c582-4e18-4b63-ae78-81da959895b5" containerName="collect-profiles" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.889190 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8658c582-4e18-4b63-ae78-81da959895b5" containerName="collect-profiles" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.889406 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8658c582-4e18-4b63-ae78-81da959895b5" containerName="collect-profiles" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.889427 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c8246f-670e-4056-9c35-19e8042a96bf" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.890089 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.891881 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.896049 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzd9z" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.896525 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.896537 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.897050 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.901391 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5"] Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988468 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv2tf\" (UniqueName: \"kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988545 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988577 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988603 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988662 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988749 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:54 crc kubenswrapper[4693]: I1204 10:30:54.988869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.090558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.090675 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc 
kubenswrapper[4693]: I1204 10:30:55.090717 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.090784 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.091148 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv2tf\" (UniqueName: \"kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.091224 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.091251 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.094753 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.094931 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.096560 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc 
kubenswrapper[4693]: I1204 10:30:55.096899 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.097031 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.105290 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.108201 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv2tf\" (UniqueName: \"kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.241601 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:30:55 crc kubenswrapper[4693]: I1204 10:30:55.795712 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5"] Dec 04 10:30:56 crc kubenswrapper[4693]: I1204 10:30:56.822591 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" event={"ID":"ba3e6d3f-0285-4742-80ba-4c15da05164c","Type":"ContainerStarted","Data":"c62d4254971b276c719b44f6559800a4c378c3067cd11f679cda50eb64463411"} Dec 04 10:30:56 crc kubenswrapper[4693]: I1204 10:30:56.823342 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" event={"ID":"ba3e6d3f-0285-4742-80ba-4c15da05164c","Type":"ContainerStarted","Data":"59dcb080b292209467b58e35c50f5e25b34c43e6a38c080da1c46607e5607346"} Dec 04 10:30:56 crc kubenswrapper[4693]: I1204 10:30:56.853385 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" podStartSLOduration=2.470014531 podStartE2EDuration="2.853316678s" podCreationTimestamp="2025-12-04 10:30:54 +0000 UTC" firstStartedPulling="2025-12-04 10:30:55.805904111 +0000 UTC m=+2901.703497864" lastFinishedPulling="2025-12-04 10:30:56.189206258 +0000 UTC m=+2902.086800011" observedRunningTime="2025-12-04 10:30:56.844877592 +0000 UTC m=+2902.742471345" watchObservedRunningTime="2025-12-04 10:30:56.853316678 +0000 UTC m=+2902.750910431" Dec 04 10:31:22 crc kubenswrapper[4693]: I1204 10:31:22.272797 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:31:22 crc kubenswrapper[4693]: I1204 10:31:22.273272 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:31:22 crc kubenswrapper[4693]: I1204 10:31:22.273316 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:31:22 crc kubenswrapper[4693]: I1204 10:31:22.274053 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:31:22 crc kubenswrapper[4693]: I1204 10:31:22.274099 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9" gracePeriod=600 Dec 04 10:31:23 crc kubenswrapper[4693]: I1204 10:31:23.054740 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" 
containerID="4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9" exitCode=0 Dec 04 10:31:23 crc kubenswrapper[4693]: I1204 10:31:23.054817 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9"} Dec 04 10:31:23 crc kubenswrapper[4693]: I1204 10:31:23.055359 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4"} Dec 04 10:31:23 crc kubenswrapper[4693]: I1204 10:31:23.055384 4693 scope.go:117] "RemoveContainer" containerID="fa9070738112462f65a81d8d233a4468d7c3bf827c80f7fe63e66e00fe161c5b" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.535912 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.539091 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.555840 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.612239 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.612396 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw7bg\" (UniqueName: \"kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.612466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.714247 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.714819 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.714819 
4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.714971 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw7bg\" (UniqueName: \"kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.715268 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.737101 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw7bg\" (UniqueName: \"kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg\") pod \"certified-operators-dtk2x\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:56 crc kubenswrapper[4693]: I1204 10:32:56.889416 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:32:57 crc kubenswrapper[4693]: I1204 10:32:57.413854 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:32:57 crc kubenswrapper[4693]: I1204 10:32:57.916854 4693 generic.go:334] "Generic (PLEG): container finished" podID="94d68018-1793-4439-95c7-10560c065834" containerID="d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a" exitCode=0 Dec 04 10:32:57 crc kubenswrapper[4693]: I1204 10:32:57.917364 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerDied","Data":"d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a"} Dec 04 10:32:57 crc kubenswrapper[4693]: I1204 10:32:57.917455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerStarted","Data":"15287ae3155af12f44184791fefeb4d048e5ca863e42f5820d7f6f209bf0e9a7"} Dec 04 10:32:58 crc kubenswrapper[4693]: I1204 10:32:58.927511 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerStarted","Data":"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353"} Dec 04 10:32:59 crc kubenswrapper[4693]: I1204 10:32:59.939789 4693 generic.go:334] "Generic (PLEG): container finished" podID="94d68018-1793-4439-95c7-10560c065834" containerID="8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353" exitCode=0 Dec 04 10:32:59 crc kubenswrapper[4693]: I1204 10:32:59.940056 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" 
event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerDied","Data":"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353"} Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.730491 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.736208 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.739060 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.800288 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnbq\" (UniqueName: \"kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.800412 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.800435 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.901359 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.901406 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.901537 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnbq\" (UniqueName: \"kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.901979 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 
10:33:00.902051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.931857 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnbq\" (UniqueName: \"kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq\") pod \"redhat-marketplace-nn8x5\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.952519 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerStarted","Data":"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171"} Dec 04 10:33:00 crc kubenswrapper[4693]: I1204 10:33:00.981489 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtk2x" podStartSLOduration=2.3768950220000002 podStartE2EDuration="4.981469447s" podCreationTimestamp="2025-12-04 10:32:56 +0000 UTC" firstStartedPulling="2025-12-04 10:32:57.919081081 +0000 UTC m=+3023.816674834" lastFinishedPulling="2025-12-04 10:33:00.523655496 +0000 UTC m=+3026.421249259" observedRunningTime="2025-12-04 10:33:00.977012888 +0000 UTC m=+3026.874606651" watchObservedRunningTime="2025-12-04 10:33:00.981469447 +0000 UTC m=+3026.879063200" Dec 04 10:33:01 crc kubenswrapper[4693]: I1204 10:33:01.115043 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:01 crc kubenswrapper[4693]: I1204 10:33:01.614590 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:01 crc kubenswrapper[4693]: I1204 10:33:01.963048 4693 generic.go:334] "Generic (PLEG): container finished" podID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerID="9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44" exitCode=0 Dec 04 10:33:01 crc kubenswrapper[4693]: I1204 10:33:01.963174 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerDied","Data":"9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44"} Dec 04 10:33:01 crc kubenswrapper[4693]: I1204 10:33:01.963202 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerStarted","Data":"c5f43d39e9ef6fd0661d53e875a3ab54298c7b72577eb53396184a8bee28b962"} Dec 04 10:33:02 crc kubenswrapper[4693]: I1204 10:33:02.973559 4693 generic.go:334] "Generic (PLEG): container finished" podID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerID="7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05" exitCode=0 Dec 04 10:33:02 crc kubenswrapper[4693]: I1204 10:33:02.973656 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerDied","Data":"7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05"} Dec 04 10:33:03 crc kubenswrapper[4693]: I1204 10:33:03.987907 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerStarted","Data":"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e"} Dec 04 10:33:04 crc kubenswrapper[4693]: I1204 10:33:04.022893 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nn8x5" podStartSLOduration=2.518111325 podStartE2EDuration="4.02286581s" podCreationTimestamp="2025-12-04 10:33:00 +0000 UTC" firstStartedPulling="2025-12-04 10:33:01.965046986 +0000 UTC m=+3027.862640739" lastFinishedPulling="2025-12-04 10:33:03.469801471 +0000 UTC m=+3029.367395224" observedRunningTime="2025-12-04 10:33:04.016401737 +0000 UTC m=+3029.913995490" watchObservedRunningTime="2025-12-04 10:33:04.02286581 +0000 UTC m=+3029.920459563" Dec 04 10:33:06 crc kubenswrapper[4693]: I1204 10:33:06.864582 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:06 crc kubenswrapper[4693]: I1204 10:33:06.890542 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:06 crc kubenswrapper[4693]: I1204 10:33:06.912796 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:07 crc kubenswrapper[4693]: I1204 10:33:07.059121 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:08 crc kubenswrapper[4693]: I1204 10:33:08.322726 4693 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.039442 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtk2x" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="registry-server" containerID="cri-o://9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171" gracePeriod=2 Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.526685 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.717324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw7bg\" (UniqueName: \"kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg\") pod \"94d68018-1793-4439-95c7-10560c065834\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.717474 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities\") pod \"94d68018-1793-4439-95c7-10560c065834\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.717683 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content\") pod \"94d68018-1793-4439-95c7-10560c065834\" (UID: \"94d68018-1793-4439-95c7-10560c065834\") " Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.719083 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities" (OuterVolumeSpecName: "utilities") pod "94d68018-1793-4439-95c7-10560c065834" (UID: "94d68018-1793-4439-95c7-10560c065834"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.723816 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg" (OuterVolumeSpecName: "kube-api-access-hw7bg") pod "94d68018-1793-4439-95c7-10560c065834" (UID: "94d68018-1793-4439-95c7-10560c065834"). InnerVolumeSpecName "kube-api-access-hw7bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.765760 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94d68018-1793-4439-95c7-10560c065834" (UID: "94d68018-1793-4439-95c7-10560c065834"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.819839 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.819879 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d68018-1793-4439-95c7-10560c065834-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:10 crc kubenswrapper[4693]: I1204 10:33:10.819891 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw7bg\" (UniqueName: \"kubernetes.io/projected/94d68018-1793-4439-95c7-10560c065834-kube-api-access-hw7bg\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.050820 4693 generic.go:334] "Generic (PLEG): container finished" podID="94d68018-1793-4439-95c7-10560c065834" containerID="9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171" exitCode=0 Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.050879 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerDied","Data":"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171"} Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.051136 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtk2x" event={"ID":"94d68018-1793-4439-95c7-10560c065834","Type":"ContainerDied","Data":"15287ae3155af12f44184791fefeb4d048e5ca863e42f5820d7f6f209bf0e9a7"} Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.051160 4693 scope.go:117] "RemoveContainer" containerID="9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.050905 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtk2x" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.074453 4693 scope.go:117] "RemoveContainer" containerID="8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.085588 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.097022 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtk2x"] Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.111596 4693 scope.go:117] "RemoveContainer" containerID="d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.115763 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.115807 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.147289 4693 scope.go:117] "RemoveContainer" containerID="9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171" Dec 04 10:33:11 crc kubenswrapper[4693]: E1204 10:33:11.147698 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171\": container with ID starting with 9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171 not found: ID does not exist" containerID="9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.147756 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171"} err="failed to get container status \"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171\": rpc error: code = NotFound desc = could not find container \"9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171\": container with ID starting with 9bd41833c1303f953ecf663035045a35d410cc8cd8f16be9dc25afe0a428d171 not found: ID does not exist" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.147784 4693 scope.go:117] "RemoveContainer" containerID="8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353" Dec 04 10:33:11 crc kubenswrapper[4693]: E1204 10:33:11.148041 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353\": container with ID starting with 8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353 not found: ID does not exist" containerID="8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.148071 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353"} err="failed to get container status \"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353\": rpc error: code = NotFound desc = could not find container \"8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353\": container with ID starting with 
8b3523547ce604ce56890eb0a12155c5d7d0831e8c55e128de3cd1c708402353 not found: ID does not exist" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.148092 4693 scope.go:117] "RemoveContainer" containerID="d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a" Dec 04 10:33:11 crc kubenswrapper[4693]: E1204 10:33:11.148251 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a\": container with ID starting with d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a not found: ID does not exist" containerID="d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.148269 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a"} err="failed to get container status \"d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a\": rpc error: code = NotFound desc = could not find container \"d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a\": container with ID starting with d0bafc17e4bbe6497c7e4851e5f94d439a37f80c4e430ad690712f432db0362a not found: ID does not exist" Dec 04 10:33:11 crc kubenswrapper[4693]: I1204 10:33:11.168066 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:12 crc kubenswrapper[4693]: I1204 10:33:12.113911 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:12 crc kubenswrapper[4693]: I1204 10:33:12.472899 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d68018-1793-4439-95c7-10560c065834" path="/var/lib/kubelet/pods/94d68018-1793-4439-95c7-10560c065834/volumes" Dec 04 10:33:13 crc kubenswrapper[4693]: I1204 10:33:13.528522 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.079557 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nn8x5" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="registry-server" containerID="cri-o://5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e" gracePeriod=2 Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.549734 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.718360 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlnbq\" (UniqueName: \"kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq\") pod \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.718440 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content\") pod \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.718515 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities\") pod \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\" (UID: \"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a\") " Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.719466 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities" (OuterVolumeSpecName: "utilities") pod "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" (UID: "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.723543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq" (OuterVolumeSpecName: "kube-api-access-hlnbq") pod "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" (UID: "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a"). InnerVolumeSpecName "kube-api-access-hlnbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.737827 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" (UID: "f7145f1c-cdcf-4e81-ba0c-9201ddb7160a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.821580 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlnbq\" (UniqueName: \"kubernetes.io/projected/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-kube-api-access-hlnbq\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.821617 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:14 crc kubenswrapper[4693]: I1204 10:33:14.821626 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.094681 4693 generic.go:334] "Generic (PLEG): container finished" podID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerID="5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e" exitCode=0 Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.094754 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nn8x5" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.094767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerDied","Data":"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e"} Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.095112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nn8x5" event={"ID":"f7145f1c-cdcf-4e81-ba0c-9201ddb7160a","Type":"ContainerDied","Data":"c5f43d39e9ef6fd0661d53e875a3ab54298c7b72577eb53396184a8bee28b962"} Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.095131 4693 scope.go:117] "RemoveContainer" containerID="5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.126949 4693 scope.go:117] "RemoveContainer" containerID="7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.136743 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.145948 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nn8x5"] Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.158428 4693 scope.go:117] "RemoveContainer" containerID="9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.202850 4693 scope.go:117] "RemoveContainer" containerID="5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e" Dec 04 10:33:15 crc kubenswrapper[4693]: E1204 10:33:15.203236 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e\": container with ID starting with 5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e not found: ID does not exist" containerID="5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.203280 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e"} err="failed to get container status \"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e\": rpc error: code = NotFound desc = could not find container \"5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e\": container with ID starting with 5e05efef6d76a0d47e2ac55d623bfd56a6f9079541fd366133119f32dfbd619e not found: ID does not exist" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.203309 4693 scope.go:117] "RemoveContainer" containerID="7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05" Dec 04 10:33:15 crc kubenswrapper[4693]: E1204 10:33:15.203689 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05\": container with ID starting with 7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05 not found: ID does not exist" containerID="7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.203724 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05"} err="failed to get container status \"7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05\": rpc error: code = NotFound desc = could not find container \"7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05\": container with ID starting with 7c6dc0e74fe11aa5c87e6f0187f0b4305828e4ac0af9cd44740330233f5ebf05 not found: ID does not exist" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.203742 4693 scope.go:117] "RemoveContainer" containerID="9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44" Dec 04 10:33:15 crc kubenswrapper[4693]: E1204 10:33:15.203991 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44\": container with ID starting with 9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44 not found: ID does not exist" containerID="9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44" Dec 04 10:33:15 crc kubenswrapper[4693]: I1204 10:33:15.204024 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44"} err="failed to get container status \"9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44\": rpc error: code = NotFound desc = could not find container \"9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44\": container with ID starting with 9ef0ea962d3859c17b0ee5f667ba767c8c8f0a0cbdffd420380f51ff83ab9b44 not found: ID does not exist" Dec 04 10:33:16 crc kubenswrapper[4693]: I1204 10:33:16.105078 4693 generic.go:334] "Generic (PLEG): container finished" podID="ba3e6d3f-0285-4742-80ba-4c15da05164c" containerID="c62d4254971b276c719b44f6559800a4c378c3067cd11f679cda50eb64463411" exitCode=0 Dec 04 10:33:16 crc kubenswrapper[4693]: I1204 10:33:16.105159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" 
event={"ID":"ba3e6d3f-0285-4742-80ba-4c15da05164c","Type":"ContainerDied","Data":"c62d4254971b276c719b44f6559800a4c378c3067cd11f679cda50eb64463411"} Dec 04 10:33:16 crc kubenswrapper[4693]: I1204 10:33:16.471488 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" path="/var/lib/kubelet/pods/f7145f1c-cdcf-4e81-ba0c-9201ddb7160a/volumes" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.547529 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.574614 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.574711 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.574782 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.574897 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.574947 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.575012 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv2tf\" (UniqueName: \"kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.575135 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle\") pod \"ba3e6d3f-0285-4742-80ba-4c15da05164c\" (UID: \"ba3e6d3f-0285-4742-80ba-4c15da05164c\") " Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.595299 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf" (OuterVolumeSpecName: "kube-api-access-gv2tf") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: 
"ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "kube-api-access-gv2tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.595896 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.604088 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.615366 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory" (OuterVolumeSpecName: "inventory") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.615934 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.631542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.634065 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ba3e6d3f-0285-4742-80ba-4c15da05164c" (UID: "ba3e6d3f-0285-4742-80ba-4c15da05164c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678034 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678073 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv2tf\" (UniqueName: \"kubernetes.io/projected/ba3e6d3f-0285-4742-80ba-4c15da05164c-kube-api-access-gv2tf\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678088 4693 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678097 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678107 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678116 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:17 crc kubenswrapper[4693]: I1204 10:33:17.678127 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3e6d3f-0285-4742-80ba-4c15da05164c-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:18 crc kubenswrapper[4693]: I1204 10:33:18.126437 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" event={"ID":"ba3e6d3f-0285-4742-80ba-4c15da05164c","Type":"ContainerDied","Data":"59dcb080b292209467b58e35c50f5e25b34c43e6a38c080da1c46607e5607346"} Dec 04 10:33:18 crc kubenswrapper[4693]: I1204 10:33:18.126480 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59dcb080b292209467b58e35c50f5e25b34c43e6a38c080da1c46607e5607346" Dec 04 10:33:18 crc kubenswrapper[4693]: I1204 10:33:18.126510 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5" Dec 04 10:33:22 crc kubenswrapper[4693]: I1204 10:33:22.273376 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:33:22 crc kubenswrapper[4693]: I1204 10:33:22.273997 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:33:52 crc kubenswrapper[4693]: I1204 10:33:52.273363 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:33:52 crc kubenswrapper[4693]: I1204 10:33:52.275288 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.203502 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204747 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="extract-utilities" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204767 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="extract-utilities" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204784 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="extract-content" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204796 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="extract-content" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204821 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="registry-server" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204829 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="registry-server" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204849 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="extract-content" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204857 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="extract-content" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204880 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="registry-server" Dec 04 10:34:10 
crc kubenswrapper[4693]: I1204 10:34:10.204888 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="registry-server" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204906 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3e6d3f-0285-4742-80ba-4c15da05164c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204916 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3e6d3f-0285-4742-80ba-4c15da05164c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 10:34:10 crc kubenswrapper[4693]: E1204 10:34:10.204960 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="extract-utilities" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.204970 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="extract-utilities" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.205208 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d68018-1793-4439-95c7-10560c065834" containerName="registry-server" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.205232 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7145f1c-cdcf-4e81-ba0c-9201ddb7160a" containerName="registry-server" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.205247 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3e6d3f-0285-4742-80ba-4c15da05164c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.206728 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.210040 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.210598 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.210797 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.222434 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.233298 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.233395 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.233428 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.335838 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336106 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336226 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336395 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336526 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336630 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336730 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.336869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.337153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97fcp\" (UniqueName: \"kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.337425 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.337622 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.343833 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439115 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439255 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439292 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439355 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439458 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97fcp\" (UniqueName: \"kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.439506 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.440184 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.440521 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.440628 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.442837 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.444191 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.460347 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97fcp\" (UniqueName: \"kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.475460 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.529592 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.964371 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 10:34:10 crc kubenswrapper[4693]: I1204 10:34:10.970699 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:34:11 crc kubenswrapper[4693]: I1204 10:34:11.635128 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da1a36ac-5a8b-475d-8434-eb43b0f8a656","Type":"ContainerStarted","Data":"bbb55b035441f891046b061192a148c683adbc49b60e4b515d4735944ee45b77"} Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.272865 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.273539 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.273595 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.274423 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.274483 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" gracePeriod=600 Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 
10:34:22.749694 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" exitCode=0 Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.749756 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4"} Dec 04 10:34:22 crc kubenswrapper[4693]: I1204 10:34:22.750064 4693 scope.go:117] "RemoveContainer" containerID="4ad1b467ce33fe682c23baf00fcb3108b9169ff6537845f6de6d424430c256b9" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.676059 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.679484 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.711314 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.844690 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.844768 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.845070 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcrw\" (UniqueName: \"kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.946913 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.947048 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcrw\" (UniqueName: \"kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.947222 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content\") pod \"redhat-operators-phcsv\" 
(UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:32 crc kubenswrapper[4693]: I1204 10:34:32.968423 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcrw\" (UniqueName: \"kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:33 crc kubenswrapper[4693]: I1204 10:34:33.116974 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:33 crc kubenswrapper[4693]: I1204 10:34:33.117104 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content\") pod \"redhat-operators-phcsv\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:33 crc kubenswrapper[4693]: I1204 10:34:33.314168 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.648353 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.726119 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.727072 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-97fcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(da1a36ac-5a8b-475d-8434-eb43b0f8a656): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.728434 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" Dec 04 10:34:48 crc kubenswrapper[4693]: I1204 10:34:48.993262 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.993746 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:34:48 crc kubenswrapper[4693]: E1204 10:34:48.996208 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" Dec 04 10:34:49 crc kubenswrapper[4693]: I1204 10:34:49.089891 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:34:50 crc kubenswrapper[4693]: I1204 10:34:50.016467 4693 generic.go:334] "Generic (PLEG): container finished" podID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerID="06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40" exitCode=0 Dec 04 10:34:50 crc kubenswrapper[4693]: I1204 10:34:50.016836 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerDied","Data":"06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40"} Dec 04 10:34:50 crc kubenswrapper[4693]: I1204 10:34:50.016881 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerStarted","Data":"aafeffefda35dd7e57dfe6969466795f52f5621567fdcf2a250295c70c0d2f27"} Dec 04 10:34:51 crc kubenswrapper[4693]: I1204 10:34:51.029445 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerStarted","Data":"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e"} Dec 04 10:34:55 crc kubenswrapper[4693]: E1204 10:34:55.338453 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afb197d_2f90_470c_b868_71d2b3aeb445.slice/crio-b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:34:57 crc kubenswrapper[4693]: I1204 10:34:57.085510 4693 generic.go:334] "Generic (PLEG): container finished" podID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerID="b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e" exitCode=0 Dec 04 10:34:57 crc kubenswrapper[4693]: I1204 10:34:57.085595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerDied","Data":"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e"} Dec 04 10:35:00 crc kubenswrapper[4693]: I1204 
10:35:00.123961 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerStarted","Data":"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b"} Dec 04 10:35:00 crc kubenswrapper[4693]: I1204 10:35:00.145276 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phcsv" podStartSLOduration=18.396384172 podStartE2EDuration="28.145255919s" podCreationTimestamp="2025-12-04 10:34:32 +0000 UTC" firstStartedPulling="2025-12-04 10:34:50.020437199 +0000 UTC m=+3135.918030952" lastFinishedPulling="2025-12-04 10:34:59.769308946 +0000 UTC m=+3145.666902699" observedRunningTime="2025-12-04 10:35:00.139543804 +0000 UTC m=+3146.037137577" watchObservedRunningTime="2025-12-04 10:35:00.145255919 +0000 UTC m=+3146.042849672" Dec 04 10:35:01 crc kubenswrapper[4693]: I1204 10:35:01.985840 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 10:35:03 crc kubenswrapper[4693]: I1204 10:35:03.162470 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da1a36ac-5a8b-475d-8434-eb43b0f8a656","Type":"ContainerStarted","Data":"5c5fd62ba406b129efbb25274fbbbd1e4f012e1ffadfa270f6bbfd59997db366"} Dec 04 10:35:03 crc kubenswrapper[4693]: I1204 10:35:03.183392 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.169971059 podStartE2EDuration="54.183365019s" podCreationTimestamp="2025-12-04 10:34:09 +0000 UTC" firstStartedPulling="2025-12-04 10:34:10.970361341 +0000 UTC m=+3096.867955094" lastFinishedPulling="2025-12-04 10:35:01.983755311 +0000 UTC m=+3147.881349054" observedRunningTime="2025-12-04 10:35:03.179253988 +0000 UTC m=+3149.076847761" watchObservedRunningTime="2025-12-04 10:35:03.183365019 +0000 UTC m=+3149.080958772" Dec 04 10:35:03 crc kubenswrapper[4693]: I1204 10:35:03.315417 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:03 crc kubenswrapper[4693]: I1204 10:35:03.315672 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:03 crc kubenswrapper[4693]: I1204 10:35:03.461487 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:35:03 crc kubenswrapper[4693]: E1204 10:35:03.462035 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:35:04 crc kubenswrapper[4693]: I1204 10:35:04.361009 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phcsv" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="registry-server" probeResult="failure" output=< Dec 04 10:35:04 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 10:35:04 crc kubenswrapper[4693]: > Dec 04 10:35:13 crc kubenswrapper[4693]: I1204 10:35:13.364593 4693 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:13 crc kubenswrapper[4693]: I1204 10:35:13.415202 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:13 crc kubenswrapper[4693]: I1204 10:35:13.612223 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.297948 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phcsv" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="registry-server" containerID="cri-o://69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b" gracePeriod=2 Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.460717 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:35:15 crc kubenswrapper[4693]: E1204 10:35:15.461464 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.783720 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.856947 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content\") pod \"6afb197d-2f90-470c-b868-71d2b3aeb445\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.857690 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcrw\" (UniqueName: \"kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw\") pod \"6afb197d-2f90-470c-b868-71d2b3aeb445\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.857794 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities\") pod \"6afb197d-2f90-470c-b868-71d2b3aeb445\" (UID: \"6afb197d-2f90-470c-b868-71d2b3aeb445\") " Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.859001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities" (OuterVolumeSpecName: "utilities") pod "6afb197d-2f90-470c-b868-71d2b3aeb445" (UID: "6afb197d-2f90-470c-b868-71d2b3aeb445"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.864984 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw" (OuterVolumeSpecName: "kube-api-access-sfcrw") pod "6afb197d-2f90-470c-b868-71d2b3aeb445" (UID: "6afb197d-2f90-470c-b868-71d2b3aeb445"). InnerVolumeSpecName "kube-api-access-sfcrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.961165 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcrw\" (UniqueName: \"kubernetes.io/projected/6afb197d-2f90-470c-b868-71d2b3aeb445-kube-api-access-sfcrw\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.961221 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:15 crc kubenswrapper[4693]: I1204 10:35:15.970822 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6afb197d-2f90-470c-b868-71d2b3aeb445" (UID: "6afb197d-2f90-470c-b868-71d2b3aeb445"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.062791 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6afb197d-2f90-470c-b868-71d2b3aeb445-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.317112 4693 generic.go:334] "Generic (PLEG): container finished" podID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerID="69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b" exitCode=0 Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.317167 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerDied","Data":"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b"} Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.317219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcsv" event={"ID":"6afb197d-2f90-470c-b868-71d2b3aeb445","Type":"ContainerDied","Data":"aafeffefda35dd7e57dfe6969466795f52f5621567fdcf2a250295c70c0d2f27"} Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.317218 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phcsv" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.317239 4693 scope.go:117] "RemoveContainer" containerID="69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.364595 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.386565 4693 scope.go:117] "RemoveContainer" containerID="b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.426363 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phcsv"] Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.441929 4693 scope.go:117] "RemoveContainer" containerID="06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.479227 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" path="/var/lib/kubelet/pods/6afb197d-2f90-470c-b868-71d2b3aeb445/volumes" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.482465 4693 scope.go:117] "RemoveContainer" containerID="69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b" Dec 04 10:35:16 crc kubenswrapper[4693]: E1204 10:35:16.483015 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b\": container with ID starting with 69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b not found: ID does not exist" containerID="69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.483050 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b"} err="failed to get container status \"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b\": rpc error: code = NotFound desc = could not find container \"69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b\": container with ID starting with 69b72f9e6b01b5bb47dab2b4deed495698fa9b131adb7434a03ace8f01cc6c2b not found: ID does not exist" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.483073 4693 scope.go:117] "RemoveContainer" containerID="b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e" Dec 04 10:35:16 crc kubenswrapper[4693]: E1204 10:35:16.483479 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e\": container with ID starting with b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e not found: ID does not exist" containerID="b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.483534 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e"} err="failed to get container status \"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e\": rpc error: code = NotFound desc = could not find container \"b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e\": 
container with ID starting with b781d934557443a1a0e14525512528119b41cfc533631034aea9aba4d0c3f27e not found: ID does not exist" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.483569 4693 scope.go:117] "RemoveContainer" containerID="06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40" Dec 04 10:35:16 crc kubenswrapper[4693]: E1204 10:35:16.483970 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40\": container with ID starting with 06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40 not found: ID does not exist" containerID="06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40" Dec 04 10:35:16 crc kubenswrapper[4693]: I1204 10:35:16.484071 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40"} err="failed to get container status \"06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40\": rpc error: code = NotFound desc = could not find container \"06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40\": container with ID starting with 06da47316ddd92bda4748e5bf384bd29e60aedb3c760b07b7eec8f602e4ecb40 not found: ID does not exist" Dec 04 10:35:30 crc kubenswrapper[4693]: I1204 10:35:30.461039 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:35:30 crc kubenswrapper[4693]: E1204 10:35:30.462003 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:35:41 crc kubenswrapper[4693]: I1204 10:35:41.461473 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:35:41 crc kubenswrapper[4693]: E1204 10:35:41.462380 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:35:54 crc kubenswrapper[4693]: I1204 10:35:54.471835 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:35:54 crc kubenswrapper[4693]: E1204 10:35:54.473115 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:36:08 crc kubenswrapper[4693]: I1204 10:36:08.461670 4693 scope.go:117] "RemoveContainer" 
containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:36:08 crc kubenswrapper[4693]: E1204 10:36:08.463042 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:36:23 crc kubenswrapper[4693]: I1204 10:36:23.461729 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:36:23 crc kubenswrapper[4693]: E1204 10:36:23.462581 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:36:36 crc kubenswrapper[4693]: I1204 10:36:36.461770 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:36:36 crc kubenswrapper[4693]: E1204 10:36:36.469598 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:36:51 crc kubenswrapper[4693]: I1204 10:36:51.462720 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:36:51 crc kubenswrapper[4693]: E1204 10:36:51.464228 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:37:04 crc kubenswrapper[4693]: I1204 10:37:04.467952 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:37:04 crc kubenswrapper[4693]: E1204 10:37:04.468791 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:37:19 crc kubenswrapper[4693]: I1204 10:37:19.461628 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:37:19 crc kubenswrapper[4693]: E1204 10:37:19.462406 4693 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:37:31 crc kubenswrapper[4693]: I1204 10:37:31.461213 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:37:31 crc kubenswrapper[4693]: E1204 10:37:31.462220 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:37:44 crc kubenswrapper[4693]: I1204 10:37:44.467672 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:37:44 crc kubenswrapper[4693]: E1204 10:37:44.468546 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:37:57 crc kubenswrapper[4693]: I1204 10:37:57.461293 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:37:57 crc kubenswrapper[4693]: E1204 10:37:57.462213 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:38:11 crc kubenswrapper[4693]: I1204 10:38:11.461805 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:38:11 crc kubenswrapper[4693]: E1204 10:38:11.462525 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:38:25 crc kubenswrapper[4693]: I1204 10:38:25.461747 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:38:25 crc kubenswrapper[4693]: E1204 10:38:25.462511 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:38:40 crc kubenswrapper[4693]: I1204 10:38:40.586094 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:38:40 crc kubenswrapper[4693]: E1204 10:38:40.586849 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:38:51 crc kubenswrapper[4693]: I1204 10:38:51.462076 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:38:51 crc kubenswrapper[4693]: E1204 10:38:51.462979 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:39:04 crc kubenswrapper[4693]: I1204 10:39:04.470918 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:39:04 crc kubenswrapper[4693]: E1204 10:39:04.471745 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:39:17 crc kubenswrapper[4693]: I1204 10:39:17.461605 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:39:17 crc kubenswrapper[4693]: E1204 10:39:17.462441 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.373975 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:29 crc kubenswrapper[4693]: E1204 10:39:29.374882 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="extract-utilities" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.374897 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="extract-utilities" Dec 04 10:39:29 crc kubenswrapper[4693]: E1204 10:39:29.374925 4693 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="extract-content" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.374931 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="extract-content" Dec 04 10:39:29 crc kubenswrapper[4693]: E1204 10:39:29.374948 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="registry-server" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.374954 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="registry-server" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.375152 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afb197d-2f90-470c-b868-71d2b3aeb445" containerName="registry-server" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.376538 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.393866 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.507288 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.507402 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw55x\" (UniqueName: \"kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.507513 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.609377 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.609518 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.609563 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw55x\" (UniqueName: 
\"kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.610016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.610099 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.660149 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw55x\" (UniqueName: \"kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x\") pod \"community-operators-h5qpl\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:29 crc kubenswrapper[4693]: I1204 10:39:29.702086 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:30 crc kubenswrapper[4693]: I1204 10:39:30.292398 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:30 crc kubenswrapper[4693]: I1204 10:39:30.672880 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerID="9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992" exitCode=0 Dec 04 10:39:30 crc kubenswrapper[4693]: I1204 10:39:30.672964 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerDied","Data":"9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992"} Dec 04 10:39:30 crc kubenswrapper[4693]: I1204 10:39:30.672996 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerStarted","Data":"d04b503972615173bfa9f1e8e16fa6711b01007270fb3da8fc71164be19706c9"} Dec 04 10:39:30 crc kubenswrapper[4693]: I1204 10:39:30.676800 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:39:31 crc kubenswrapper[4693]: I1204 10:39:31.461471 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:39:31 crc kubenswrapper[4693]: I1204 10:39:31.685106 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8"} Dec 04 10:39:32 crc kubenswrapper[4693]: I1204 10:39:32.696521 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" 
event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerStarted","Data":"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32"} Dec 04 10:39:34 crc kubenswrapper[4693]: I1204 10:39:34.714380 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerID="620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32" exitCode=0 Dec 04 10:39:34 crc kubenswrapper[4693]: I1204 10:39:34.714461 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerDied","Data":"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32"} Dec 04 10:39:37 crc kubenswrapper[4693]: I1204 10:39:37.744528 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerStarted","Data":"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81"} Dec 04 10:39:37 crc kubenswrapper[4693]: I1204 10:39:37.777883 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5qpl" podStartSLOduration=2.704129378 podStartE2EDuration="8.777861598s" podCreationTimestamp="2025-12-04 10:39:29 +0000 UTC" firstStartedPulling="2025-12-04 10:39:30.67649093 +0000 UTC m=+3416.574084683" lastFinishedPulling="2025-12-04 10:39:36.75022315 +0000 UTC m=+3422.647816903" observedRunningTime="2025-12-04 10:39:37.768064014 +0000 UTC m=+3423.665657777" watchObservedRunningTime="2025-12-04 10:39:37.777861598 +0000 UTC m=+3423.675455351" Dec 04 10:39:39 crc kubenswrapper[4693]: I1204 10:39:39.703889 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:39 crc kubenswrapper[4693]: I1204 10:39:39.704529 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:39 crc kubenswrapper[4693]: I1204 10:39:39.752344 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:49 crc kubenswrapper[4693]: I1204 10:39:49.747821 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:49 crc kubenswrapper[4693]: I1204 10:39:49.799281 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:49 crc kubenswrapper[4693]: I1204 10:39:49.853421 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5qpl" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="registry-server" containerID="cri-o://b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81" gracePeriod=2 Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.565951 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.643220 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities\") pod \"e0cc0b09-861b-4e5b-baac-046c695ce29c\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.643357 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content\") pod \"e0cc0b09-861b-4e5b-baac-046c695ce29c\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.643471 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw55x\" (UniqueName: \"kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x\") pod \"e0cc0b09-861b-4e5b-baac-046c695ce29c\" (UID: \"e0cc0b09-861b-4e5b-baac-046c695ce29c\") " Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.644680 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities" (OuterVolumeSpecName: "utilities") pod "e0cc0b09-861b-4e5b-baac-046c695ce29c" (UID: "e0cc0b09-861b-4e5b-baac-046c695ce29c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.650597 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.650794 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x" (OuterVolumeSpecName: "kube-api-access-qw55x") pod "e0cc0b09-861b-4e5b-baac-046c695ce29c" (UID: "e0cc0b09-861b-4e5b-baac-046c695ce29c"). InnerVolumeSpecName "kube-api-access-qw55x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.697811 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0cc0b09-861b-4e5b-baac-046c695ce29c" (UID: "e0cc0b09-861b-4e5b-baac-046c695ce29c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.752678 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0cc0b09-861b-4e5b-baac-046c695ce29c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.752714 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw55x\" (UniqueName: \"kubernetes.io/projected/e0cc0b09-861b-4e5b-baac-046c695ce29c-kube-api-access-qw55x\") on node \"crc\" DevicePath \"\"" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.863299 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerID="b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81" exitCode=0 Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.863361 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerDied","Data":"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81"} Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.863390 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5qpl" event={"ID":"e0cc0b09-861b-4e5b-baac-046c695ce29c","Type":"ContainerDied","Data":"d04b503972615173bfa9f1e8e16fa6711b01007270fb3da8fc71164be19706c9"} Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.863406 4693 scope.go:117] "RemoveContainer" containerID="b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.863470 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5qpl" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.884045 4693 scope.go:117] "RemoveContainer" containerID="620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.904271 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.920704 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5qpl"] Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.924726 4693 scope.go:117] "RemoveContainer" containerID="9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.955666 4693 scope.go:117] "RemoveContainer" containerID="b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81" Dec 04 10:39:50 crc kubenswrapper[4693]: E1204 10:39:50.956177 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81\": container with ID starting with b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81 not found: ID does not exist" containerID="b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.956230 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81"} err="failed to get container status \"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81\": rpc error: code = NotFound desc = could not find container \"b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81\": container with ID starting with b4939bf831e944addaae305486fd69be2c881aa1098c3a6b21761031c25d5a81 not found: ID does not exist" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.956259 4693 scope.go:117] "RemoveContainer" containerID="620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32" Dec 04 10:39:50 crc kubenswrapper[4693]: E1204 10:39:50.957055 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32\": container with ID starting with 620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32 not found: ID does not exist" containerID="620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.957082 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32"} err="failed to get container status \"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32\": rpc error: code = NotFound desc = could not find container \"620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32\": container with ID starting with 620574c2c4e10c92d515debc844d3bbf3de2a71378038413773e047e0d9bfe32 not found: ID does not exist" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.957105 4693 scope.go:117] "RemoveContainer" containerID="9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992" Dec 04 10:39:50 crc kubenswrapper[4693]: E1204 10:39:50.957632 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992\": container with ID starting with 9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992 not found: ID does not exist" containerID="9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992" Dec 04 10:39:50 crc kubenswrapper[4693]: I1204 10:39:50.957655 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992"} err="failed to get container status \"9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992\": rpc error: code = NotFound desc = could not find container \"9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992\": container with ID starting with 9daf5023026f8772e91d5b6a5efda5c167a3fe0acb4f1f5a7c15750be5067992 not found: ID does not exist" Dec 04 10:39:52 crc kubenswrapper[4693]: I1204 10:39:52.474162 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" path="/var/lib/kubelet/pods/e0cc0b09-861b-4e5b-baac-046c695ce29c/volumes" Dec 04 10:41:52 crc kubenswrapper[4693]: I1204 10:41:52.272937 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:41:52 crc kubenswrapper[4693]: I1204 10:41:52.273574 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:42:22 crc kubenswrapper[4693]: I1204 10:42:22.273223 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:42:22 crc kubenswrapper[4693]: I1204 10:42:22.273877 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.273365 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.274159 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.274217 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.275067 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.275125 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8" gracePeriod=600 Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.629083 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8" exitCode=0 Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.629286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8"} Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.629466 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a"} Dec 04 10:42:52 crc kubenswrapper[4693]: I1204 10:42:52.629488 4693 scope.go:117] "RemoveContainer" containerID="c3858867eeeb8e4b56ab5ae4d552adb031931d1a172111641f8e864c5f34d0f4" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.001528 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:19 crc kubenswrapper[4693]: E1204 10:44:19.002600 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="extract-utilities" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.002618 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="extract-utilities" Dec 04 10:44:19 crc kubenswrapper[4693]: E1204 10:44:19.002645 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="registry-server" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.002653 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="registry-server" Dec 04 10:44:19 crc kubenswrapper[4693]: E1204 10:44:19.002680 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="extract-content" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.002688 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="extract-content" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.002949 4693 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e0cc0b09-861b-4e5b-baac-046c695ce29c" containerName="registry-server" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.004742 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.014299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.127941 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.128049 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.128108 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jkpn\" (UniqueName: \"kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.230588 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.230743 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.230840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jkpn\" (UniqueName: \"kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.231183 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.231508 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities\") pod \"certified-operators-tf947\" (UID: 
\"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.263814 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jkpn\" (UniqueName: \"kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn\") pod \"certified-operators-tf947\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.331786 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:19 crc kubenswrapper[4693]: I1204 10:44:19.988275 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:20 crc kubenswrapper[4693]: I1204 10:44:20.661602 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerStarted","Data":"6e9882c335056ca985a94c319f2dc76cf35b9ab7c385e65a2ef6742bce2ef75f"} Dec 04 10:44:21 crc kubenswrapper[4693]: I1204 10:44:21.673825 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e27c0b0-e856-4672-94b7-e896295653ec" containerID="baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935" exitCode=0 Dec 04 10:44:21 crc kubenswrapper[4693]: I1204 10:44:21.673875 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerDied","Data":"baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935"} Dec 04 10:44:23 crc kubenswrapper[4693]: I1204 10:44:23.690865 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerStarted","Data":"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0"} Dec 04 10:44:24 crc kubenswrapper[4693]: I1204 10:44:24.703532 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e27c0b0-e856-4672-94b7-e896295653ec" containerID="dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0" exitCode=0 Dec 04 10:44:24 crc kubenswrapper[4693]: I1204 10:44:24.703584 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerDied","Data":"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0"} Dec 04 10:44:25 crc kubenswrapper[4693]: I1204 10:44:25.715136 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerStarted","Data":"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262"} Dec 04 10:44:25 crc kubenswrapper[4693]: I1204 10:44:25.741286 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tf947" podStartSLOduration=4.304462793 podStartE2EDuration="7.741269628s" podCreationTimestamp="2025-12-04 10:44:18 +0000 UTC" firstStartedPulling="2025-12-04 10:44:21.67619808 +0000 UTC m=+3707.573791833" lastFinishedPulling="2025-12-04 10:44:25.113004915 +0000 UTC m=+3711.010598668" observedRunningTime="2025-12-04 10:44:25.734884566 +0000 UTC m=+3711.632478319" 
watchObservedRunningTime="2025-12-04 10:44:25.741269628 +0000 UTC m=+3711.638863381" Dec 04 10:44:29 crc kubenswrapper[4693]: I1204 10:44:29.332199 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:29 crc kubenswrapper[4693]: I1204 10:44:29.332646 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:29 crc kubenswrapper[4693]: I1204 10:44:29.384048 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:39 crc kubenswrapper[4693]: I1204 10:44:39.393582 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:39 crc kubenswrapper[4693]: I1204 10:44:39.454395 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:39 crc kubenswrapper[4693]: I1204 10:44:39.840821 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tf947" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="registry-server" containerID="cri-o://42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262" gracePeriod=2 Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.583852 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.628108 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jkpn\" (UniqueName: \"kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn\") pod \"7e27c0b0-e856-4672-94b7-e896295653ec\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.628234 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content\") pod \"7e27c0b0-e856-4672-94b7-e896295653ec\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.628313 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities\") pod \"7e27c0b0-e856-4672-94b7-e896295653ec\" (UID: \"7e27c0b0-e856-4672-94b7-e896295653ec\") " Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.629043 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities" (OuterVolumeSpecName: "utilities") pod "7e27c0b0-e856-4672-94b7-e896295653ec" (UID: "7e27c0b0-e856-4672-94b7-e896295653ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.652654 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn" (OuterVolumeSpecName: "kube-api-access-6jkpn") pod "7e27c0b0-e856-4672-94b7-e896295653ec" (UID: "7e27c0b0-e856-4672-94b7-e896295653ec"). InnerVolumeSpecName "kube-api-access-6jkpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.681636 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e27c0b0-e856-4672-94b7-e896295653ec" (UID: "7e27c0b0-e856-4672-94b7-e896295653ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.730209 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.730591 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jkpn\" (UniqueName: \"kubernetes.io/projected/7e27c0b0-e856-4672-94b7-e896295653ec-kube-api-access-6jkpn\") on node \"crc\" DevicePath \"\"" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.730651 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27c0b0-e856-4672-94b7-e896295653ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.871494 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e27c0b0-e856-4672-94b7-e896295653ec" containerID="42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262" exitCode=0 Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.871546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerDied","Data":"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262"} Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.871819 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf947" event={"ID":"7e27c0b0-e856-4672-94b7-e896295653ec","Type":"ContainerDied","Data":"6e9882c335056ca985a94c319f2dc76cf35b9ab7c385e65a2ef6742bce2ef75f"} Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.871632 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tf947" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.871921 4693 scope.go:117] "RemoveContainer" containerID="42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.908010 4693 scope.go:117] "RemoveContainer" containerID="dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.927247 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.937035 4693 scope.go:117] "RemoveContainer" containerID="baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.940357 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tf947"] Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.982261 4693 scope.go:117] "RemoveContainer" containerID="42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262" Dec 04 10:44:40 crc kubenswrapper[4693]: E1204 10:44:40.982791 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262\": container with ID starting with 42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262 not found: ID does not exist" containerID="42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.982840 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262"} err="failed to get container status \"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262\": rpc error: code = NotFound desc = could not find container \"42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262\": container with ID starting with 42159723a71f970dc92efe524b1cc7d28d1baa46f59689a175a51bc50b07f262 not found: ID does not exist" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.982872 4693 scope.go:117] "RemoveContainer" containerID="dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0" Dec 04 10:44:40 crc kubenswrapper[4693]: E1204 10:44:40.983244 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0\": container with ID starting with dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0 not found: ID does not exist" containerID="dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.983277 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0"} err="failed to get container status \"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0\": rpc error: code = NotFound desc = could not find container \"dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0\": container with ID starting with dd93cf31ce0499572c84631aa9820f4364ef824fd5d77430659a01ce77d5afd0 not found: ID does not exist" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.983296 4693 scope.go:117] "RemoveContainer" 
containerID="baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935" Dec 04 10:44:40 crc kubenswrapper[4693]: E1204 10:44:40.983586 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935\": container with ID starting with baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935 not found: ID does not exist" containerID="baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935" Dec 04 10:44:40 crc kubenswrapper[4693]: I1204 10:44:40.983617 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935"} err="failed to get container status \"baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935\": rpc error: code = NotFound desc = could not find container \"baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935\": container with ID starting with baf392cbd86414364e0375b2a00e02af627d57b17fa7bafb2704fa0829ffe935 not found: ID does not exist" Dec 04 10:44:42 crc kubenswrapper[4693]: I1204 10:44:42.473222 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" path="/var/lib/kubelet/pods/7e27c0b0-e856-4672-94b7-e896295653ec/volumes" Dec 04 10:44:52 crc kubenswrapper[4693]: I1204 10:44:52.272843 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:44:52 crc kubenswrapper[4693]: I1204 10:44:52.273500 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.656511 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:44:59 crc kubenswrapper[4693]: E1204 10:44:59.657282 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="registry-server" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.657295 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="registry-server" Dec 04 10:44:59 crc kubenswrapper[4693]: E1204 10:44:59.657335 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="extract-content" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.657343 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="extract-content" Dec 04 10:44:59 crc kubenswrapper[4693]: E1204 10:44:59.657366 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="extract-utilities" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.657372 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="extract-utilities" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 
10:44:59.657589 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27c0b0-e856-4672-94b7-e896295653ec" containerName="registry-server" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.659848 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.697704 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.732159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqmbr\" (UniqueName: \"kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.732267 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.732306 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.833936 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.834004 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.834131 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmbr\" (UniqueName: \"kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.834662 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.834692 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities\") pod 
\"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.855967 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqmbr\" (UniqueName: \"kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr\") pod \"redhat-operators-tctjr\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:44:59 crc kubenswrapper[4693]: I1204 10:44:59.982757 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.188594 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl"] Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.190556 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.193286 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.197144 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl"] Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.197194 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.242939 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.243039 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2m7z\" (UniqueName: \"kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.243079 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.349778 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.349956 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2m7z\" (UniqueName: \"kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.350041 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.351246 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.355868 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.376155 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2m7z\" (UniqueName: \"kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z\") pod \"collect-profiles-29414085-xg6bl\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.539396 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:45:00 crc kubenswrapper[4693]: I1204 10:45:00.546148 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:01 crc kubenswrapper[4693]: I1204 10:45:01.041222 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerID="99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb" exitCode=0 Dec 04 10:45:01 crc kubenswrapper[4693]: I1204 10:45:01.041579 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerDied","Data":"99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb"} Dec 04 10:45:01 crc kubenswrapper[4693]: I1204 10:45:01.041638 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerStarted","Data":"67b85a04300a9b1d45c1c666c16450ac964337272b5308368df6c76351508eb2"} Dec 04 10:45:01 crc kubenswrapper[4693]: I1204 10:45:01.043841 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:45:01 crc kubenswrapper[4693]: I1204 10:45:01.115032 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl"] Dec 04 10:45:02 crc kubenswrapper[4693]: I1204 10:45:02.069125 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerStarted","Data":"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729"} Dec 04 10:45:02 crc kubenswrapper[4693]: I1204 10:45:02.074092 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8fa56cf-2051-47fc-9ac0-d28653185bd6" containerID="f550bb027b26b10e52780da2ed9403c4fdb175c0c625d2c8146d2d07800a46a3" exitCode=0 Dec 04 10:45:02 crc kubenswrapper[4693]: I1204 10:45:02.074146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" event={"ID":"e8fa56cf-2051-47fc-9ac0-d28653185bd6","Type":"ContainerDied","Data":"f550bb027b26b10e52780da2ed9403c4fdb175c0c625d2c8146d2d07800a46a3"} Dec 04 10:45:02 crc kubenswrapper[4693]: I1204 10:45:02.074178 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" event={"ID":"e8fa56cf-2051-47fc-9ac0-d28653185bd6","Type":"ContainerStarted","Data":"5c9ca85850111a9c5b960e9cbd3c8b9677a6d969b169418d8e49d72b5e0823bb"} Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.655917 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.731918 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume\") pod \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.733067 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2m7z\" (UniqueName: \"kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z\") pod \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.733149 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume\") pod \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\" (UID: \"e8fa56cf-2051-47fc-9ac0-d28653185bd6\") " Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.733768 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8fa56cf-2051-47fc-9ac0-d28653185bd6" (UID: "e8fa56cf-2051-47fc-9ac0-d28653185bd6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.734056 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8fa56cf-2051-47fc-9ac0-d28653185bd6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.739885 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8fa56cf-2051-47fc-9ac0-d28653185bd6" (UID: "e8fa56cf-2051-47fc-9ac0-d28653185bd6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.739962 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z" (OuterVolumeSpecName: "kube-api-access-c2m7z") pod "e8fa56cf-2051-47fc-9ac0-d28653185bd6" (UID: "e8fa56cf-2051-47fc-9ac0-d28653185bd6"). InnerVolumeSpecName "kube-api-access-c2m7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.836537 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8fa56cf-2051-47fc-9ac0-d28653185bd6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4693]: I1204 10:45:03.836576 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2m7z\" (UniqueName: \"kubernetes.io/projected/e8fa56cf-2051-47fc-9ac0-d28653185bd6-kube-api-access-c2m7z\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:04 crc kubenswrapper[4693]: I1204 10:45:04.092483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" event={"ID":"e8fa56cf-2051-47fc-9ac0-d28653185bd6","Type":"ContainerDied","Data":"5c9ca85850111a9c5b960e9cbd3c8b9677a6d969b169418d8e49d72b5e0823bb"} Dec 04 10:45:04 crc kubenswrapper[4693]: I1204 10:45:04.092789 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c9ca85850111a9c5b960e9cbd3c8b9677a6d969b169418d8e49d72b5e0823bb" Dec 04 10:45:04 crc kubenswrapper[4693]: I1204 10:45:04.092530 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl" Dec 04 10:45:05 crc kubenswrapper[4693]: I1204 10:45:05.002052 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn"] Dec 04 10:45:05 crc kubenswrapper[4693]: I1204 10:45:05.011600 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-grzzn"] Dec 04 10:45:06 crc kubenswrapper[4693]: I1204 10:45:06.114074 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerID="b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729" exitCode=0 Dec 04 10:45:06 crc kubenswrapper[4693]: I1204 10:45:06.114457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerDied","Data":"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729"} Dec 04 10:45:06 crc kubenswrapper[4693]: I1204 10:45:06.490144 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274f76aa-470f-4b21-830f-650997de36c3" path="/var/lib/kubelet/pods/274f76aa-470f-4b21-830f-650997de36c3/volumes" Dec 04 10:45:07 crc kubenswrapper[4693]: I1204 10:45:07.125250 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerStarted","Data":"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b"} Dec 04 10:45:07 crc kubenswrapper[4693]: I1204 10:45:07.150230 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tctjr" podStartSLOduration=2.675957081 podStartE2EDuration="8.150210052s" podCreationTimestamp="2025-12-04 10:44:59 +0000 UTC" firstStartedPulling="2025-12-04 10:45:01.043451556 +0000 UTC m=+3746.941045309" lastFinishedPulling="2025-12-04 10:45:06.517704537 +0000 UTC m=+3752.415298280" observedRunningTime="2025-12-04 10:45:07.14451436 +0000 UTC m=+3753.042108133" watchObservedRunningTime="2025-12-04 10:45:07.150210052 +0000 UTC m=+3753.047803805" Dec 04 
10:45:09 crc kubenswrapper[4693]: I1204 10:45:09.983222 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:09 crc kubenswrapper[4693]: I1204 10:45:09.983864 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:11 crc kubenswrapper[4693]: I1204 10:45:11.031253 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tctjr" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="registry-server" probeResult="failure" output=< Dec 04 10:45:11 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 10:45:11 crc kubenswrapper[4693]: > Dec 04 10:45:11 crc kubenswrapper[4693]: I1204 10:45:11.040273 4693 scope.go:117] "RemoveContainer" containerID="b1ae4eb4959369467400a7cc229bba38ce351e14d6587d8fa83b7f336a4d63b0" Dec 04 10:45:20 crc kubenswrapper[4693]: I1204 10:45:20.032117 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:20 crc kubenswrapper[4693]: I1204 10:45:20.081774 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:20 crc kubenswrapper[4693]: I1204 10:45:20.286887 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:45:21 crc kubenswrapper[4693]: I1204 10:45:21.297235 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tctjr" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="registry-server" containerID="cri-o://7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b" gracePeriod=2 Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.027197 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.135646 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities\") pod \"7ee8418e-9378-4fa2-8f9c-802898c06e25\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.135789 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqmbr\" (UniqueName: \"kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr\") pod \"7ee8418e-9378-4fa2-8f9c-802898c06e25\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.135862 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content\") pod \"7ee8418e-9378-4fa2-8f9c-802898c06e25\" (UID: \"7ee8418e-9378-4fa2-8f9c-802898c06e25\") " Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.136507 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities" (OuterVolumeSpecName: "utilities") pod "7ee8418e-9378-4fa2-8f9c-802898c06e25" (UID: "7ee8418e-9378-4fa2-8f9c-802898c06e25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.145529 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr" (OuterVolumeSpecName: "kube-api-access-nqmbr") pod "7ee8418e-9378-4fa2-8f9c-802898c06e25" (UID: "7ee8418e-9378-4fa2-8f9c-802898c06e25"). InnerVolumeSpecName "kube-api-access-nqmbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.239531 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.239561 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqmbr\" (UniqueName: \"kubernetes.io/projected/7ee8418e-9378-4fa2-8f9c-802898c06e25-kube-api-access-nqmbr\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.261970 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ee8418e-9378-4fa2-8f9c-802898c06e25" (UID: "7ee8418e-9378-4fa2-8f9c-802898c06e25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.272981 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.273069 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.308749 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tctjr" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.308856 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerDied","Data":"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b"} Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.308922 4693 scope.go:117] "RemoveContainer" containerID="7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.310675 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerID="7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b" exitCode=0 Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.310738 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctjr" event={"ID":"7ee8418e-9378-4fa2-8f9c-802898c06e25","Type":"ContainerDied","Data":"67b85a04300a9b1d45c1c666c16450ac964337272b5308368df6c76351508eb2"} Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.330159 4693 scope.go:117] "RemoveContainer" containerID="b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.352031 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee8418e-9378-4fa2-8f9c-802898c06e25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.355449 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.363286 4693 scope.go:117] "RemoveContainer" containerID="99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.365954 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tctjr"] Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.408840 4693 scope.go:117] "RemoveContainer" containerID="7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b" Dec 04 10:45:22 crc kubenswrapper[4693]: E1204 10:45:22.409486 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b\": container with ID starting with 7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b not found: ID does not exist" containerID="7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.409545 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b"} err="failed to get container status \"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b\": rpc error: code = NotFound desc = could not find container \"7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b\": container with ID starting with 7529a3602d705c12d0e7a1c4ebf663290f0e23b439e5b8537c72a0624c894f8b not found: ID does not exist" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.409572 4693 scope.go:117] "RemoveContainer" containerID="b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729" Dec 04 10:45:22 crc 
kubenswrapper[4693]: E1204 10:45:22.409849 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729\": container with ID starting with b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729 not found: ID does not exist" containerID="b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.409875 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729"} err="failed to get container status \"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729\": rpc error: code = NotFound desc = could not find container \"b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729\": container with ID starting with b5f0c804a84ffd68bd66360144735abedffe8cd6b246bbf6a71fa27e1f9df729 not found: ID does not exist" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.409888 4693 scope.go:117] "RemoveContainer" containerID="99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb" Dec 04 10:45:22 crc kubenswrapper[4693]: E1204 10:45:22.410133 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb\": container with ID starting with 99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb not found: ID does not exist" containerID="99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.410180 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb"} err="failed to get container status \"99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb\": rpc error: code = NotFound desc = could not find container \"99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb\": container with ID starting with 99842dce7febb2106072ccd63f673d890e037e8787ee10b817f9aa563ffc9ccb not found: ID does not exist" Dec 04 10:45:22 crc kubenswrapper[4693]: I1204 10:45:22.472821 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" path="/var/lib/kubelet/pods/7ee8418e-9378-4fa2-8f9c-802898c06e25/volumes" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.273046 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.273592 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.273653 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.274640 4693 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.274708 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" gracePeriod=600 Dec 04 10:45:52 crc kubenswrapper[4693]: E1204 10:45:52.393754 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.696141 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" exitCode=0 Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.696188 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a"} Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.696245 4693 scope.go:117] "RemoveContainer" containerID="44d11ce84a57f06206b908e6d45309b9aa24ca82e2e438a566c1f7e6bf0f7fc8" Dec 04 10:45:52 crc kubenswrapper[4693]: I1204 10:45:52.696977 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:45:52 crc kubenswrapper[4693]: E1204 10:45:52.697292 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:46:05 crc kubenswrapper[4693]: I1204 10:46:05.461626 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:46:05 crc kubenswrapper[4693]: E1204 10:46:05.462686 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:46:16 crc kubenswrapper[4693]: I1204 10:46:16.461184 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 
10:46:16 crc kubenswrapper[4693]: E1204 10:46:16.462268 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:46:28 crc kubenswrapper[4693]: I1204 10:46:28.461412 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:46:28 crc kubenswrapper[4693]: E1204 10:46:28.462369 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:46:41 crc kubenswrapper[4693]: I1204 10:46:41.461584 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:46:41 crc kubenswrapper[4693]: E1204 10:46:41.462613 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:46:53 crc kubenswrapper[4693]: I1204 10:46:53.461638 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:46:53 crc kubenswrapper[4693]: E1204 10:46:53.463256 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:47:06 crc kubenswrapper[4693]: I1204 10:47:06.461036 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:47:06 crc kubenswrapper[4693]: E1204 10:47:06.461783 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:47:21 crc kubenswrapper[4693]: I1204 10:47:21.461702 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:47:21 crc kubenswrapper[4693]: E1204 10:47:21.462536 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:47:32 crc kubenswrapper[4693]: I1204 10:47:32.461139 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:47:32 crc kubenswrapper[4693]: E1204 10:47:32.461984 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:47:47 crc kubenswrapper[4693]: I1204 10:47:47.462003 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:47:47 crc kubenswrapper[4693]: E1204 10:47:47.462719 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:47:58 crc kubenswrapper[4693]: I1204 10:47:58.461653 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:47:58 crc kubenswrapper[4693]: E1204 10:47:58.462492 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:48:09 crc kubenswrapper[4693]: I1204 10:48:09.460962 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:48:09 crc kubenswrapper[4693]: E1204 10:48:09.461744 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.797315 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:14 crc kubenswrapper[4693]: E1204 10:48:14.798424 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="extract-content" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798441 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" 
containerName="extract-content" Dec 04 10:48:14 crc kubenswrapper[4693]: E1204 10:48:14.798456 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fa56cf-2051-47fc-9ac0-d28653185bd6" containerName="collect-profiles" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798465 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fa56cf-2051-47fc-9ac0-d28653185bd6" containerName="collect-profiles" Dec 04 10:48:14 crc kubenswrapper[4693]: E1204 10:48:14.798487 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="registry-server" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798495 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="registry-server" Dec 04 10:48:14 crc kubenswrapper[4693]: E1204 10:48:14.798508 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="extract-utilities" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798515 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="extract-utilities" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798769 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fa56cf-2051-47fc-9ac0-d28653185bd6" containerName="collect-profiles" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.798801 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee8418e-9378-4fa2-8f9c-802898c06e25" containerName="registry-server" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.800601 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.811106 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.979733 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jfm\" (UniqueName: \"kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.980609 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:14 crc kubenswrapper[4693]: I1204 10:48:14.980789 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.083206 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities\") pod \"redhat-marketplace-t9lmv\" (UID: 
\"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.083289 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.083435 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jfm\" (UniqueName: \"kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.083844 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.083877 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.103958 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jfm\" (UniqueName: \"kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm\") pod \"redhat-marketplace-t9lmv\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.160882 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:15 crc kubenswrapper[4693]: I1204 10:48:15.629889 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:16 crc kubenswrapper[4693]: I1204 10:48:16.075322 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerStarted","Data":"0a50edfc5ca56c9aa84e06b84ed9a09039ed0dc854c53bf4e6ddca5cd3ec812a"} Dec 04 10:48:17 crc kubenswrapper[4693]: I1204 10:48:17.086795 4693 generic.go:334] "Generic (PLEG): container finished" podID="1752f4d7-a46c-48ae-9120-53ea05857491" containerID="b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d" exitCode=0 Dec 04 10:48:17 crc kubenswrapper[4693]: I1204 10:48:17.086979 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerDied","Data":"b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d"} Dec 04 10:48:18 crc kubenswrapper[4693]: I1204 10:48:18.098287 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerStarted","Data":"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358"} Dec 04 10:48:19 crc kubenswrapper[4693]: I1204 10:48:19.109147 4693 generic.go:334] "Generic (PLEG): container finished" podID="1752f4d7-a46c-48ae-9120-53ea05857491" containerID="ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358" exitCode=0 Dec 04 10:48:19 crc kubenswrapper[4693]: I1204 10:48:19.109207 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerDied","Data":"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358"} Dec 04 10:48:20 crc kubenswrapper[4693]: I1204 10:48:20.121178 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerStarted","Data":"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d"} Dec 04 10:48:20 crc kubenswrapper[4693]: I1204 10:48:20.147897 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t9lmv" podStartSLOduration=3.528706652 podStartE2EDuration="6.147878362s" podCreationTimestamp="2025-12-04 10:48:14 +0000 UTC" firstStartedPulling="2025-12-04 10:48:17.090877831 +0000 UTC m=+3942.988471584" lastFinishedPulling="2025-12-04 10:48:19.710049541 +0000 UTC m=+3945.607643294" observedRunningTime="2025-12-04 10:48:20.139823276 +0000 UTC m=+3946.037417029" watchObservedRunningTime="2025-12-04 10:48:20.147878362 +0000 UTC m=+3946.045472115" Dec 04 10:48:24 crc kubenswrapper[4693]: I1204 10:48:24.468716 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:48:24 crc kubenswrapper[4693]: E1204 10:48:24.469786 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:48:25 crc kubenswrapper[4693]: I1204 10:48:25.161092 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:25 crc kubenswrapper[4693]: I1204 10:48:25.161403 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:25 crc kubenswrapper[4693]: I1204 10:48:25.403928 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:26 crc kubenswrapper[4693]: I1204 10:48:26.225436 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:26 crc kubenswrapper[4693]: I1204 10:48:26.278317 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:28 crc kubenswrapper[4693]: I1204 10:48:28.184983 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t9lmv" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="registry-server" containerID="cri-o://9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d" gracePeriod=2 Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.130203 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.195523 4693 generic.go:334] "Generic (PLEG): container finished" podID="1752f4d7-a46c-48ae-9120-53ea05857491" containerID="9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d" exitCode=0 Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.195571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerDied","Data":"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d"} Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.195623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9lmv" event={"ID":"1752f4d7-a46c-48ae-9120-53ea05857491","Type":"ContainerDied","Data":"0a50edfc5ca56c9aa84e06b84ed9a09039ed0dc854c53bf4e6ddca5cd3ec812a"} Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.195648 4693 scope.go:117] "RemoveContainer" containerID="9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.195649 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9lmv" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.221785 4693 scope.go:117] "RemoveContainer" containerID="ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.224899 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content\") pod \"1752f4d7-a46c-48ae-9120-53ea05857491\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.225022 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jfm\" (UniqueName: \"kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm\") pod \"1752f4d7-a46c-48ae-9120-53ea05857491\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.225113 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities\") pod \"1752f4d7-a46c-48ae-9120-53ea05857491\" (UID: \"1752f4d7-a46c-48ae-9120-53ea05857491\") " Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.226554 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities" (OuterVolumeSpecName: "utilities") pod "1752f4d7-a46c-48ae-9120-53ea05857491" (UID: "1752f4d7-a46c-48ae-9120-53ea05857491"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.231699 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm" (OuterVolumeSpecName: "kube-api-access-k6jfm") pod "1752f4d7-a46c-48ae-9120-53ea05857491" (UID: "1752f4d7-a46c-48ae-9120-53ea05857491"). InnerVolumeSpecName "kube-api-access-k6jfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.246803 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1752f4d7-a46c-48ae-9120-53ea05857491" (UID: "1752f4d7-a46c-48ae-9120-53ea05857491"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.247145 4693 scope.go:117] "RemoveContainer" containerID="b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.328087 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jfm\" (UniqueName: \"kubernetes.io/projected/1752f4d7-a46c-48ae-9120-53ea05857491-kube-api-access-k6jfm\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.328131 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.328145 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1752f4d7-a46c-48ae-9120-53ea05857491-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.347992 4693 scope.go:117] "RemoveContainer" containerID="9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d" Dec 04 10:48:29 crc kubenswrapper[4693]: E1204 10:48:29.348605 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d\": container with ID starting with 9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d not found: ID does not exist" containerID="9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.348655 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d"} err="failed to get container status \"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d\": rpc error: code = NotFound desc = could not find container \"9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d\": container with ID starting with 9b99a3178baf87d047bea2264d03bb604dba2dc07f90e5c22938c17f881a5f9d not found: ID does not exist" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.348684 4693 scope.go:117] "RemoveContainer" containerID="ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358" Dec 04 10:48:29 crc kubenswrapper[4693]: E1204 10:48:29.349168 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358\": container with ID starting with ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358 not found: ID does not exist" containerID="ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.349220 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358"} err="failed to get container status \"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358\": rpc error: code = NotFound desc = could not find container \"ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358\": container with ID starting with ec63aec17710730511d85baff727ffaae44b8fe03bec57e8bc9155fb2a192358 not found: ID does not exist" Dec 04 10:48:29 crc 
kubenswrapper[4693]: I1204 10:48:29.349249 4693 scope.go:117] "RemoveContainer" containerID="b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d" Dec 04 10:48:29 crc kubenswrapper[4693]: E1204 10:48:29.350657 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d\": container with ID starting with b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d not found: ID does not exist" containerID="b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.350691 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d"} err="failed to get container status \"b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d\": rpc error: code = NotFound desc = could not find container \"b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d\": container with ID starting with b28d56082049df867d8ca59f29b32559476679ec68473f01bde0b4a9dd84773d not found: ID does not exist" Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.569072 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:29 crc kubenswrapper[4693]: I1204 10:48:29.579273 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9lmv"] Dec 04 10:48:30 crc kubenswrapper[4693]: I1204 10:48:30.473861 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" path="/var/lib/kubelet/pods/1752f4d7-a46c-48ae-9120-53ea05857491/volumes" Dec 04 10:48:35 crc kubenswrapper[4693]: I1204 10:48:35.462280 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:48:35 crc kubenswrapper[4693]: E1204 10:48:35.463121 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:48:47 crc kubenswrapper[4693]: I1204 10:48:47.462042 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:48:47 crc kubenswrapper[4693]: E1204 10:48:47.463001 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:49:02 crc kubenswrapper[4693]: I1204 10:49:02.465047 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:49:02 crc kubenswrapper[4693]: E1204 10:49:02.467191 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:49:17 crc kubenswrapper[4693]: I1204 10:49:17.462073 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:49:17 crc kubenswrapper[4693]: E1204 10:49:17.462990 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:49:31 crc kubenswrapper[4693]: I1204 10:49:31.461149 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:49:31 crc kubenswrapper[4693]: E1204 10:49:31.461848 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:49:42 crc kubenswrapper[4693]: I1204 10:49:42.461684 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:49:42 crc kubenswrapper[4693]: E1204 10:49:42.462457 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:49:53 crc kubenswrapper[4693]: I1204 10:49:53.461016 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:49:53 crc kubenswrapper[4693]: E1204 10:49:53.461698 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:50:06 crc kubenswrapper[4693]: I1204 10:50:06.462123 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:50:06 crc kubenswrapper[4693]: E1204 10:50:06.463014 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:50:21 crc kubenswrapper[4693]: I1204 10:50:21.461210 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:50:21 crc kubenswrapper[4693]: E1204 10:50:21.463374 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:50:35 crc kubenswrapper[4693]: I1204 10:50:35.461774 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:50:35 crc kubenswrapper[4693]: E1204 10:50:35.462779 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:50:50 crc kubenswrapper[4693]: I1204 10:50:50.463293 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:50:50 crc kubenswrapper[4693]: E1204 10:50:50.464046 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:51:03 crc kubenswrapper[4693]: I1204 10:51:03.461919 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a" Dec 04 10:51:04 crc kubenswrapper[4693]: I1204 10:51:04.642199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858"} Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.580115 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:51:47 crc kubenswrapper[4693]: E1204 10:51:47.580979 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="extract-content" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.580998 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="extract-content" Dec 04 10:51:47 crc kubenswrapper[4693]: E1204 10:51:47.581025 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="registry-server" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.581031 4693 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="registry-server" Dec 04 10:51:47 crc kubenswrapper[4693]: E1204 10:51:47.581046 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="extract-utilities" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.581052 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="extract-utilities" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.581234 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1752f4d7-a46c-48ae-9120-53ea05857491" containerName="registry-server" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.582615 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.597097 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.707715 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9ghm\" (UniqueName: \"kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.707822 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.707891 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.810293 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9ghm\" (UniqueName: \"kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.810750 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.810841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.811466 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.840391 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.859740 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9ghm\" (UniqueName: \"kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm\") pod \"community-operators-4zz7q\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") " pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:47 crc kubenswrapper[4693]: I1204 10:51:47.953224 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:48 crc kubenswrapper[4693]: I1204 10:51:48.601299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:51:48 crc kubenswrapper[4693]: W1204 10:51:48.614710 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e714ffd_c8b1_4134_9414_9568e2528da1.slice/crio-bb00268c5451f75d5ffb9b5a0c23295cbc379b2cbbb971d51e77366bbee6bc9e WatchSource:0}: Error finding container bb00268c5451f75d5ffb9b5a0c23295cbc379b2cbbb971d51e77366bbee6bc9e: Status 404 returned error can't find the container with id bb00268c5451f75d5ffb9b5a0c23295cbc379b2cbbb971d51e77366bbee6bc9e Dec 04 10:51:49 crc kubenswrapper[4693]: I1204 10:51:49.047219 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerID="a6ddfdac557bb5e65d165910a513cd63656f76494a4656c1343844cc6c1bd05e" exitCode=0 Dec 04 10:51:49 crc kubenswrapper[4693]: I1204 10:51:49.047417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerDied","Data":"a6ddfdac557bb5e65d165910a513cd63656f76494a4656c1343844cc6c1bd05e"} Dec 04 10:51:49 crc kubenswrapper[4693]: I1204 10:51:49.047581 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerStarted","Data":"bb00268c5451f75d5ffb9b5a0c23295cbc379b2cbbb971d51e77366bbee6bc9e"} Dec 04 10:51:49 crc kubenswrapper[4693]: I1204 10:51:49.050425 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:51:50 crc kubenswrapper[4693]: I1204 10:51:50.059937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerStarted","Data":"c14096119e041cbf2495057eba9b3d92eb6cafb076e3b40fbbb718f1ba1a0812"} Dec 04 10:51:51 crc kubenswrapper[4693]: I1204 10:51:51.076659 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerID="c14096119e041cbf2495057eba9b3d92eb6cafb076e3b40fbbb718f1ba1a0812" exitCode=0 Dec 04 10:51:51 crc kubenswrapper[4693]: I1204 10:51:51.076723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerDied","Data":"c14096119e041cbf2495057eba9b3d92eb6cafb076e3b40fbbb718f1ba1a0812"} Dec 04 10:51:52 crc kubenswrapper[4693]: I1204 10:51:52.088524 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerStarted","Data":"4e334f8e7928af4cd26bea2a21fe4fe9fbb34132fe6d8c4fdc6d1a082261d598"} Dec 04 10:51:52 crc kubenswrapper[4693]: I1204 10:51:52.116682 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4zz7q" podStartSLOduration=2.6632679379999997 podStartE2EDuration="5.116661387s" podCreationTimestamp="2025-12-04 10:51:47 +0000 UTC" firstStartedPulling="2025-12-04 10:51:49.050135162 +0000 UTC m=+4154.947728915" lastFinishedPulling="2025-12-04 10:51:51.503528601 +0000 UTC m=+4157.401122364" observedRunningTime="2025-12-04 10:51:52.112430144 +0000 UTC m=+4158.010023897" watchObservedRunningTime="2025-12-04 10:51:52.116661387 +0000 UTC m=+4158.014255140" Dec 04 10:51:57 crc kubenswrapper[4693]: I1204 10:51:57.953888 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:57 crc kubenswrapper[4693]: I1204 10:51:57.954849 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:58 crc kubenswrapper[4693]: I1204 10:51:58.000853 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:58 crc kubenswrapper[4693]: I1204 10:51:58.178203 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:51:58 crc kubenswrapper[4693]: I1204 10:51:58.232720 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:52:00 crc kubenswrapper[4693]: I1204 10:52:00.152192 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4zz7q" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="registry-server" containerID="cri-o://4e334f8e7928af4cd26bea2a21fe4fe9fbb34132fe6d8c4fdc6d1a082261d598" gracePeriod=2 Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.163186 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerID="4e334f8e7928af4cd26bea2a21fe4fe9fbb34132fe6d8c4fdc6d1a082261d598" exitCode=0 Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.163308 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerDied","Data":"4e334f8e7928af4cd26bea2a21fe4fe9fbb34132fe6d8c4fdc6d1a082261d598"} Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.425212 4693 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.425212 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zz7q"
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.508866 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities\") pod \"7e714ffd-c8b1-4134-9414-9568e2528da1\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") "
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.508950 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content\") pod \"7e714ffd-c8b1-4134-9414-9568e2528da1\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") "
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.509088 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9ghm\" (UniqueName: \"kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm\") pod \"7e714ffd-c8b1-4134-9414-9568e2528da1\" (UID: \"7e714ffd-c8b1-4134-9414-9568e2528da1\") "
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.509741 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities" (OuterVolumeSpecName: "utilities") pod "7e714ffd-c8b1-4134-9414-9568e2528da1" (UID: "7e714ffd-c8b1-4134-9414-9568e2528da1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.520946 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm" (OuterVolumeSpecName: "kube-api-access-n9ghm") pod "7e714ffd-c8b1-4134-9414-9568e2528da1" (UID: "7e714ffd-c8b1-4134-9414-9568e2528da1"). InnerVolumeSpecName "kube-api-access-n9ghm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.569657 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e714ffd-c8b1-4134-9414-9568e2528da1" (UID: "7e714ffd-c8b1-4134-9414-9568e2528da1"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.611189 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.611228 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e714ffd-c8b1-4134-9414-9568e2528da1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:01 crc kubenswrapper[4693]: I1204 10:52:01.611244 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9ghm\" (UniqueName: \"kubernetes.io/projected/7e714ffd-c8b1-4134-9414-9568e2528da1-kube-api-access-n9ghm\") on node \"crc\" DevicePath \"\"" Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.174177 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zz7q" event={"ID":"7e714ffd-c8b1-4134-9414-9568e2528da1","Type":"ContainerDied","Data":"bb00268c5451f75d5ffb9b5a0c23295cbc379b2cbbb971d51e77366bbee6bc9e"} Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.174247 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zz7q" Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.174505 4693 scope.go:117] "RemoveContainer" containerID="4e334f8e7928af4cd26bea2a21fe4fe9fbb34132fe6d8c4fdc6d1a082261d598" Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.206607 4693 scope.go:117] "RemoveContainer" containerID="c14096119e041cbf2495057eba9b3d92eb6cafb076e3b40fbbb718f1ba1a0812" Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.227475 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.249731 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4zz7q"] Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.260746 4693 scope.go:117] "RemoveContainer" containerID="a6ddfdac557bb5e65d165910a513cd63656f76494a4656c1343844cc6c1bd05e" Dec 04 10:52:02 crc kubenswrapper[4693]: I1204 10:52:02.471808 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" path="/var/lib/kubelet/pods/7e714ffd-c8b1-4134-9414-9568e2528da1/volumes" Dec 04 10:53:22 crc kubenswrapper[4693]: I1204 10:53:22.273421 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:53:22 crc kubenswrapper[4693]: I1204 10:53:22.273936 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:53:52 crc kubenswrapper[4693]: I1204 10:53:52.273052 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:53:52 crc kubenswrapper[4693]: I1204 10:53:52.273606 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.272671 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.274630 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.274775 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x"
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.275602 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.275727 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858" gracePeriod=600
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.445903 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858" exitCode=0
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.445968 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858"}
Dec 04 10:54:22 crc kubenswrapper[4693]: I1204 10:54:22.446752 4693 scope.go:117] "RemoveContainer" containerID="48fac3fc68bc373fe809a9760a52fe963e8dba724c42edbd08216bcd202a4f7a"
Dec 04 10:54:23 crc kubenswrapper[4693]: I1204 10:54:23.456844 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d"}
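The liveness failures above are plain HTTP GET probes against the endpoint shown in the output field; "connection refused" means nothing was listening on 127.0.0.1:8798 at that moment. A minimal Go sketch of an equivalent check run from the node (the 2s timeout here is an arbitrary illustration, not the pod's configured probe timeout):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// Issue the same GET the kubelet prober reports in the log. An error such as
// "dial tcp 127.0.0.1:8798: connect: connection refused" corresponds to the
// probe failures recorded above; an HTTP probe is treated as successful when
// the response status is in the 2xx/3xx range.
func main() {
	client := &http.Client{Timeout: 2 * time.Second} // timeout chosen for illustration
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe result:", resp.Status)
}
```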
Dec 04 10:56:22 crc kubenswrapper[4693]: I1204 10:56:22.273430 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:56:22 crc kubenswrapper[4693]: I1204 10:56:22.273980 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:56:30 crc kubenswrapper[4693]: E1204 10:56:30.036716 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.891427 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"]
Dec 04 10:56:44 crc kubenswrapper[4693]: E1204 10:56:44.892291 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="extract-content"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.892302 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="extract-content"
Dec 04 10:56:44 crc kubenswrapper[4693]: E1204 10:56:44.892314 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="extract-utilities"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.892320 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="extract-utilities"
Dec 04 10:56:44 crc kubenswrapper[4693]: E1204 10:56:44.892353 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="registry-server"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.892359 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="registry-server"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.892528 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e714ffd-c8b1-4134-9414-9568e2528da1" containerName="registry-server"
Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.893823 4693 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:44 crc kubenswrapper[4693]: I1204 10:56:44.968364 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"] Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.038058 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.038140 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.038204 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvgv5\" (UniqueName: \"kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.140281 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.140703 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.140761 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvgv5\" (UniqueName: \"kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.141077 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.141232 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.162965 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lvgv5\" (UniqueName: \"kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5\") pod \"redhat-operators-t8xdb\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.271787 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:56:45 crc kubenswrapper[4693]: I1204 10:56:45.868007 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"] Dec 04 10:56:46 crc kubenswrapper[4693]: I1204 10:56:46.758888 4693 generic.go:334] "Generic (PLEG): container finished" podID="8280e8f5-d660-4200-8762-04c377358cb5" containerID="7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157" exitCode=0 Dec 04 10:56:46 crc kubenswrapper[4693]: I1204 10:56:46.760197 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerDied","Data":"7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157"} Dec 04 10:56:46 crc kubenswrapper[4693]: I1204 10:56:46.760255 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerStarted","Data":"41f430a50ec0398c2d84c2f8b94e28fe3219918416dea80e6c27630c0e69115d"} Dec 04 10:56:47 crc kubenswrapper[4693]: I1204 10:56:47.768539 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerStarted","Data":"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0"} Dec 04 10:56:50 crc kubenswrapper[4693]: I1204 10:56:50.796216 4693 generic.go:334] "Generic (PLEG): container finished" podID="8280e8f5-d660-4200-8762-04c377358cb5" containerID="a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0" exitCode=0 Dec 04 10:56:50 crc kubenswrapper[4693]: I1204 10:56:50.796252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerDied","Data":"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0"} Dec 04 10:56:50 crc kubenswrapper[4693]: I1204 10:56:50.799847 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:56:51 crc kubenswrapper[4693]: I1204 10:56:51.812992 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerStarted","Data":"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1"} Dec 04 10:56:51 crc kubenswrapper[4693]: I1204 10:56:51.833677 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t8xdb" podStartSLOduration=3.406654524 podStartE2EDuration="7.833654337s" podCreationTimestamp="2025-12-04 10:56:44 +0000 UTC" firstStartedPulling="2025-12-04 10:56:46.761611661 +0000 UTC m=+4452.659205414" lastFinishedPulling="2025-12-04 10:56:51.188611474 +0000 UTC m=+4457.086205227" observedRunningTime="2025-12-04 10:56:51.830945014 +0000 UTC m=+4457.728538767" watchObservedRunningTime="2025-12-04 10:56:51.833654337 +0000 UTC m=+4457.731248120" Dec 04 10:56:52 crc 
kubenswrapper[4693]: I1204 10:56:52.272738 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:56:52 crc kubenswrapper[4693]: I1204 10:56:52.273113 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:56:55 crc kubenswrapper[4693]: I1204 10:56:55.272339 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t8xdb"
Dec 04 10:56:55 crc kubenswrapper[4693]: I1204 10:56:55.272887 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t8xdb"
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.327000 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t8xdb" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="registry-server" probeResult="failure" output=<
Dec 04 10:56:56 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s
Dec 04 10:56:56 crc kubenswrapper[4693]: >
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.854819 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"]
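The startup-probe output above ("timeout: failed to connect service \":50051\" within 1s") looks like the output of a grpc_health_probe-style exec probe checking the registry-server's gRPC health endpoint on port 50051 before the catalog has finished loading. A rough Go stand-in for such a check, with the same 1s budget (the target address comes from the log; package paths are the standard grpc-go ones, and this is an illustration, not the pod's actual probe command):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// Dial the registry's gRPC endpoint and call the standard health service,
// giving the whole attempt 1s to match the "within 1s" in the probe output.
func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 1*time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // fail within the 1s budget instead of connecting lazily
	if err != nil {
		fmt.Println("probe failure:", err) // analogous to the timeout reported above
		return
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		fmt.Println("health check error:", err)
		return
	}
	fmt.Println("serving status:", resp.GetStatus()) // SERVING once the catalog is ready
}
```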
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.857106 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.869624 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"]
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.973824 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.973992 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9clx\" (UniqueName: \"kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:56 crc kubenswrapper[4693]: I1204 10:56:56.974014 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.075974 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9clx\" (UniqueName: \"kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.076037 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.076181 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.077194 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.077755 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5"
Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.118465 4693 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-l9clx\" (UniqueName: \"kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx\") pod \"certified-operators-jlcz5\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.176403 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.800066 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"] Dec 04 10:56:57 crc kubenswrapper[4693]: I1204 10:56:57.886811 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerStarted","Data":"6f1368ab178354ffe3e94cf0e0878f118400d6de1fcf1874e7d095424163703f"} Dec 04 10:56:58 crc kubenswrapper[4693]: I1204 10:56:58.897638 4693 generic.go:334] "Generic (PLEG): container finished" podID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerID="5f0990094bd17407ede0babf59766a14b4af6bf3a2ece13a5583ed6ce478f928" exitCode=0 Dec 04 10:56:58 crc kubenswrapper[4693]: I1204 10:56:58.897729 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerDied","Data":"5f0990094bd17407ede0babf59766a14b4af6bf3a2ece13a5583ed6ce478f928"} Dec 04 10:56:59 crc kubenswrapper[4693]: I1204 10:56:59.910576 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerStarted","Data":"be37b23226974f60cae8c16a148f05022fc70f60a51de7bf540624f84c5cde22"} Dec 04 10:57:00 crc kubenswrapper[4693]: E1204 10:57:00.869882 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd2d4e91_6885_4639_903c_99ab41cb86a7.slice/crio-be37b23226974f60cae8c16a148f05022fc70f60a51de7bf540624f84c5cde22.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:57:01 crc kubenswrapper[4693]: I1204 10:57:01.932737 4693 generic.go:334] "Generic (PLEG): container finished" podID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerID="be37b23226974f60cae8c16a148f05022fc70f60a51de7bf540624f84c5cde22" exitCode=0 Dec 04 10:57:01 crc kubenswrapper[4693]: I1204 10:57:01.932796 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerDied","Data":"be37b23226974f60cae8c16a148f05022fc70f60a51de7bf540624f84c5cde22"} Dec 04 10:57:03 crc kubenswrapper[4693]: I1204 10:57:02.966200 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerStarted","Data":"4e75d4732a7491bccd9e8aaf10a61267d2fcaf089ca780dd6ee7c35c9dcfd72a"} Dec 04 10:57:03 crc kubenswrapper[4693]: I1204 10:57:03.013992 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jlcz5" podStartSLOduration=3.383382711 podStartE2EDuration="7.01397605s" podCreationTimestamp="2025-12-04 10:56:56 +0000 UTC" firstStartedPulling="2025-12-04 
10:56:58.899732166 +0000 UTC m=+4464.797325909" lastFinishedPulling="2025-12-04 10:57:02.530325495 +0000 UTC m=+4468.427919248" observedRunningTime="2025-12-04 10:57:03.0136079 +0000 UTC m=+4468.911201663" watchObservedRunningTime="2025-12-04 10:57:03.01397605 +0000 UTC m=+4468.911569793" Dec 04 10:57:05 crc kubenswrapper[4693]: I1204 10:57:05.326385 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:57:05 crc kubenswrapper[4693]: I1204 10:57:05.379927 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:57:05 crc kubenswrapper[4693]: I1204 10:57:05.561709 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"] Dec 04 10:57:06 crc kubenswrapper[4693]: I1204 10:57:06.996853 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t8xdb" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="registry-server" containerID="cri-o://84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1" gracePeriod=2 Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.176578 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.176913 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.251893 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.669840 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.811026 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content\") pod \"8280e8f5-d660-4200-8762-04c377358cb5\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.811102 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities\") pod \"8280e8f5-d660-4200-8762-04c377358cb5\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.811744 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities" (OuterVolumeSpecName: "utilities") pod "8280e8f5-d660-4200-8762-04c377358cb5" (UID: "8280e8f5-d660-4200-8762-04c377358cb5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.811916 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvgv5\" (UniqueName: \"kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5\") pod \"8280e8f5-d660-4200-8762-04c377358cb5\" (UID: \"8280e8f5-d660-4200-8762-04c377358cb5\") " Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.813831 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.834217 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5" (OuterVolumeSpecName: "kube-api-access-lvgv5") pod "8280e8f5-d660-4200-8762-04c377358cb5" (UID: "8280e8f5-d660-4200-8762-04c377358cb5"). InnerVolumeSpecName "kube-api-access-lvgv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.916131 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvgv5\" (UniqueName: \"kubernetes.io/projected/8280e8f5-d660-4200-8762-04c377358cb5-kube-api-access-lvgv5\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:07 crc kubenswrapper[4693]: I1204 10:57:07.916631 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8280e8f5-d660-4200-8762-04c377358cb5" (UID: "8280e8f5-d660-4200-8762-04c377358cb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.007233 4693 generic.go:334] "Generic (PLEG): container finished" podID="8280e8f5-d660-4200-8762-04c377358cb5" containerID="84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1" exitCode=0 Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.008137 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t8xdb" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.009462 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerDied","Data":"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1"} Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.009538 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t8xdb" event={"ID":"8280e8f5-d660-4200-8762-04c377358cb5","Type":"ContainerDied","Data":"41f430a50ec0398c2d84c2f8b94e28fe3219918416dea80e6c27630c0e69115d"} Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.009556 4693 scope.go:117] "RemoveContainer" containerID="84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.018526 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8280e8f5-d660-4200-8762-04c377358cb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.041399 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"] Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.042443 4693 scope.go:117] "RemoveContainer" containerID="a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.049055 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t8xdb"] Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.076056 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.077043 4693 scope.go:117] "RemoveContainer" containerID="7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.121956 4693 scope.go:117] "RemoveContainer" containerID="84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1" Dec 04 10:57:08 crc kubenswrapper[4693]: E1204 10:57:08.122290 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1\": container with ID starting with 84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1 not found: ID does not exist" containerID="84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.122346 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1"} err="failed to get container status \"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1\": rpc error: code = NotFound desc = could not find container \"84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1\": container with ID starting with 84795a13744cde80d388e32edd67facad9215b0f6481de4b8867c73b5d9293e1 not found: ID does not exist" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.122386 4693 scope.go:117] "RemoveContainer" containerID="a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0" Dec 04 10:57:08 crc kubenswrapper[4693]: E1204 10:57:08.122846 4693 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0\": container with ID starting with a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0 not found: ID does not exist" containerID="a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.122875 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0"} err="failed to get container status \"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0\": rpc error: code = NotFound desc = could not find container \"a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0\": container with ID starting with a77f7b52340e9d8725338d0def4635ecd5be8f9e807176442844761972d55ea0 not found: ID does not exist" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.122899 4693 scope.go:117] "RemoveContainer" containerID="7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157" Dec 04 10:57:08 crc kubenswrapper[4693]: E1204 10:57:08.123221 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157\": container with ID starting with 7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157 not found: ID does not exist" containerID="7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.123241 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157"} err="failed to get container status \"7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157\": rpc error: code = NotFound desc = could not find container \"7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157\": container with ID starting with 7b280fd1b3b6fb417e12fc55485533854a4da07437b29f4b5b140e9614f1d157 not found: ID does not exist" Dec 04 10:57:08 crc kubenswrapper[4693]: I1204 10:57:08.473249 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8280e8f5-d660-4200-8762-04c377358cb5" path="/var/lib/kubelet/pods/8280e8f5-d660-4200-8762-04c377358cb5/volumes" Dec 04 10:57:10 crc kubenswrapper[4693]: I1204 10:57:10.358958 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"] Dec 04 10:57:10 crc kubenswrapper[4693]: I1204 10:57:10.359806 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jlcz5" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="registry-server" containerID="cri-o://4e75d4732a7491bccd9e8aaf10a61267d2fcaf089ca780dd6ee7c35c9dcfd72a" gracePeriod=2 Dec 04 10:57:12 crc kubenswrapper[4693]: I1204 10:57:12.060045 4693 generic.go:334] "Generic (PLEG): container finished" podID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerID="4e75d4732a7491bccd9e8aaf10a61267d2fcaf089ca780dd6ee7c35c9dcfd72a" exitCode=0 Dec 04 10:57:12 crc kubenswrapper[4693]: I1204 10:57:12.060584 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" 
event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerDied","Data":"4e75d4732a7491bccd9e8aaf10a61267d2fcaf089ca780dd6ee7c35c9dcfd72a"} Dec 04 10:57:12 crc kubenswrapper[4693]: I1204 10:57:12.893674 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.020222 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content\") pod \"fd2d4e91-6885-4639-903c-99ab41cb86a7\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.020322 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9clx\" (UniqueName: \"kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx\") pod \"fd2d4e91-6885-4639-903c-99ab41cb86a7\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.020416 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities\") pod \"fd2d4e91-6885-4639-903c-99ab41cb86a7\" (UID: \"fd2d4e91-6885-4639-903c-99ab41cb86a7\") " Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.021158 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities" (OuterVolumeSpecName: "utilities") pod "fd2d4e91-6885-4639-903c-99ab41cb86a7" (UID: "fd2d4e91-6885-4639-903c-99ab41cb86a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.035647 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx" (OuterVolumeSpecName: "kube-api-access-l9clx") pod "fd2d4e91-6885-4639-903c-99ab41cb86a7" (UID: "fd2d4e91-6885-4639-903c-99ab41cb86a7"). InnerVolumeSpecName "kube-api-access-l9clx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.070368 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlcz5" event={"ID":"fd2d4e91-6885-4639-903c-99ab41cb86a7","Type":"ContainerDied","Data":"6f1368ab178354ffe3e94cf0e0878f118400d6de1fcf1874e7d095424163703f"} Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.070420 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlcz5" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.070427 4693 scope.go:117] "RemoveContainer" containerID="4e75d4732a7491bccd9e8aaf10a61267d2fcaf089ca780dd6ee7c35c9dcfd72a" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.073043 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd2d4e91-6885-4639-903c-99ab41cb86a7" (UID: "fd2d4e91-6885-4639-903c-99ab41cb86a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.093685 4693 scope.go:117] "RemoveContainer" containerID="be37b23226974f60cae8c16a148f05022fc70f60a51de7bf540624f84c5cde22" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.122964 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.122995 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9clx\" (UniqueName: \"kubernetes.io/projected/fd2d4e91-6885-4639-903c-99ab41cb86a7-kube-api-access-l9clx\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.123006 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2d4e91-6885-4639-903c-99ab41cb86a7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.137014 4693 scope.go:117] "RemoveContainer" containerID="5f0990094bd17407ede0babf59766a14b4af6bf3a2ece13a5583ed6ce478f928" Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.403797 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"] Dec 04 10:57:13 crc kubenswrapper[4693]: I1204 10:57:13.412950 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jlcz5"] Dec 04 10:57:14 crc kubenswrapper[4693]: I1204 10:57:14.471807 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" path="/var/lib/kubelet/pods/fd2d4e91-6885-4639-903c-99ab41cb86a7/volumes" Dec 04 10:57:22 crc kubenswrapper[4693]: I1204 10:57:22.272570 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:57:22 crc kubenswrapper[4693]: I1204 10:57:22.273120 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:57:22 crc kubenswrapper[4693]: I1204 10:57:22.273173 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 10:57:22 crc kubenswrapper[4693]: I1204 10:57:22.273955 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:57:22 crc kubenswrapper[4693]: I1204 10:57:22.274010 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" 
containerID="cri-o://964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" gracePeriod=600 Dec 04 10:57:22 crc kubenswrapper[4693]: E1204 10:57:22.793392 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:57:23 crc kubenswrapper[4693]: I1204 10:57:23.193668 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" exitCode=0 Dec 04 10:57:23 crc kubenswrapper[4693]: I1204 10:57:23.193757 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d"} Dec 04 10:57:23 crc kubenswrapper[4693]: I1204 10:57:23.194085 4693 scope.go:117] "RemoveContainer" containerID="fe10638fb663763e0faff0a79722de2da773bc9f1713bf195673135eae12a858" Dec 04 10:57:23 crc kubenswrapper[4693]: I1204 10:57:23.194681 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:57:23 crc kubenswrapper[4693]: E1204 10:57:23.195012 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:57:35 crc kubenswrapper[4693]: I1204 10:57:35.461224 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:57:35 crc kubenswrapper[4693]: E1204 10:57:35.462088 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:57:48 crc kubenswrapper[4693]: I1204 10:57:48.461449 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:57:48 crc kubenswrapper[4693]: E1204 10:57:48.462254 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:00 crc kubenswrapper[4693]: I1204 10:58:00.461482 4693 scope.go:117] "RemoveContainer" 
containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:00 crc kubenswrapper[4693]: E1204 10:58:00.462381 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:11 crc kubenswrapper[4693]: I1204 10:58:11.461138 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:11 crc kubenswrapper[4693]: E1204 10:58:11.461996 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:23 crc kubenswrapper[4693]: I1204 10:58:23.462064 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:23 crc kubenswrapper[4693]: E1204 10:58:23.462987 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:35 crc kubenswrapper[4693]: I1204 10:58:35.462238 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:35 crc kubenswrapper[4693]: E1204 10:58:35.464112 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:47 crc kubenswrapper[4693]: I1204 10:58:47.461798 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:47 crc kubenswrapper[4693]: E1204 10:58:47.462674 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:58:58 crc kubenswrapper[4693]: I1204 10:58:58.461100 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:58:58 crc kubenswrapper[4693]: E1204 10:58:58.461956 4693 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.079504 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080661 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080678 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080689 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080696 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080733 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="extract-utilities" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080739 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="extract-utilities" Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080749 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="extract-content" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080754 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="extract-content" Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080771 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="extract-utilities" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080777 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="extract-utilities" Dec 04 10:59:02 crc kubenswrapper[4693]: E1204 10:59:02.080786 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="extract-content" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080791 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="extract-content" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080972 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8280e8f5-d660-4200-8762-04c377358cb5" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.080992 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2d4e91-6885-4639-903c-99ab41cb86a7" containerName="registry-server" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.082631 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.096159 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.237580 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.237664 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.237759 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c485k\" (UniqueName: \"kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.339662 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.339760 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.339840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c485k\" (UniqueName: \"kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.340184 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.340261 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.359093 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c485k\" (UniqueName: \"kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k\") pod \"redhat-marketplace-chtqr\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.403527 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:02 crc kubenswrapper[4693]: I1204 10:59:02.977926 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:03 crc kubenswrapper[4693]: I1204 10:59:03.053442 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerStarted","Data":"592a987b48bdb9069fd4a2c4c63e24ce799bc1a722a593103f7b2b7dbd2a8bad"} Dec 04 10:59:05 crc kubenswrapper[4693]: I1204 10:59:05.072327 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerStarted","Data":"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033"} Dec 04 10:59:06 crc kubenswrapper[4693]: I1204 10:59:06.082301 4693 generic.go:334] "Generic (PLEG): container finished" podID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerID="01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033" exitCode=0 Dec 04 10:59:06 crc kubenswrapper[4693]: I1204 10:59:06.082455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerDied","Data":"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033"} Dec 04 10:59:08 crc kubenswrapper[4693]: I1204 10:59:08.109941 4693 generic.go:334] "Generic (PLEG): container finished" podID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerID="6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575" exitCode=0 Dec 04 10:59:08 crc kubenswrapper[4693]: I1204 10:59:08.110014 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerDied","Data":"6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575"} Dec 04 10:59:09 crc kubenswrapper[4693]: I1204 10:59:09.123074 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerStarted","Data":"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687"} Dec 04 10:59:09 crc kubenswrapper[4693]: I1204 10:59:09.150750 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-chtqr" podStartSLOduration=4.719963261 podStartE2EDuration="7.150722606s" podCreationTimestamp="2025-12-04 10:59:02 +0000 UTC" firstStartedPulling="2025-12-04 10:59:06.084760854 +0000 UTC m=+4591.982354607" lastFinishedPulling="2025-12-04 10:59:08.515520209 +0000 UTC m=+4594.413113952" observedRunningTime="2025-12-04 10:59:09.144067927 +0000 UTC m=+4595.041661680" watchObservedRunningTime="2025-12-04 10:59:09.150722606 +0000 UTC m=+4595.048316359" Dec 04 10:59:12 crc kubenswrapper[4693]: I1204 10:59:12.404654 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:12 crc kubenswrapper[4693]: I1204 10:59:12.405303 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:12 crc kubenswrapper[4693]: I1204 10:59:12.453309 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:12 crc kubenswrapper[4693]: I1204 10:59:12.461561 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:59:12 crc kubenswrapper[4693]: E1204 10:59:12.462034 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:59:13 crc kubenswrapper[4693]: I1204 10:59:13.230446 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:13 crc kubenswrapper[4693]: I1204 10:59:13.280896 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.182857 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-chtqr" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="registry-server" containerID="cri-o://0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687" gracePeriod=2 Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.769270 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.927293 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content\") pod \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.937273 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c485k\" (UniqueName: \"kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k\") pod \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.937358 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities\") pod \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\" (UID: \"d3eb9afe-ec99-4a3e-9304-f5114731da4b\") " Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.939251 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities" (OuterVolumeSpecName: "utilities") pod "d3eb9afe-ec99-4a3e-9304-f5114731da4b" (UID: "d3eb9afe-ec99-4a3e-9304-f5114731da4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.946500 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k" (OuterVolumeSpecName: "kube-api-access-c485k") pod "d3eb9afe-ec99-4a3e-9304-f5114731da4b" (UID: "d3eb9afe-ec99-4a3e-9304-f5114731da4b"). InnerVolumeSpecName "kube-api-access-c485k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:59:15 crc kubenswrapper[4693]: I1204 10:59:15.949510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3eb9afe-ec99-4a3e-9304-f5114731da4b" (UID: "d3eb9afe-ec99-4a3e-9304-f5114731da4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.039973 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c485k\" (UniqueName: \"kubernetes.io/projected/d3eb9afe-ec99-4a3e-9304-f5114731da4b-kube-api-access-c485k\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.040012 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.040022 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eb9afe-ec99-4a3e-9304-f5114731da4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.195226 4693 generic.go:334] "Generic (PLEG): container finished" podID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerID="0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687" exitCode=0 Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.195266 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerDied","Data":"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687"} Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.195293 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtqr" event={"ID":"d3eb9afe-ec99-4a3e-9304-f5114731da4b","Type":"ContainerDied","Data":"592a987b48bdb9069fd4a2c4c63e24ce799bc1a722a593103f7b2b7dbd2a8bad"} Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.195309 4693 scope.go:117] "RemoveContainer" containerID="0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.195435 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtqr" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.227515 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.241036 4693 scope.go:117] "RemoveContainer" containerID="6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.242194 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtqr"] Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.271234 4693 scope.go:117] "RemoveContainer" containerID="01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.326279 4693 scope.go:117] "RemoveContainer" containerID="0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687" Dec 04 10:59:16 crc kubenswrapper[4693]: E1204 10:59:16.326800 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687\": container with ID starting with 0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687 not found: ID does not exist" containerID="0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.326837 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687"} err="failed to get container status \"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687\": rpc error: code = NotFound desc = could not find container \"0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687\": container with ID starting with 0e76dca42ccef0e5da677ab1c8a3ce0431b0a0017fcd87e17c198a7d9e2f1687 not found: ID does not exist" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.326865 4693 scope.go:117] "RemoveContainer" containerID="6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575" Dec 04 10:59:16 crc kubenswrapper[4693]: E1204 10:59:16.327308 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575\": container with ID starting with 6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575 not found: ID does not exist" containerID="6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.327352 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575"} err="failed to get container status \"6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575\": rpc error: code = NotFound desc = could not find container \"6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575\": container with ID starting with 6f5dc430c50485d41c8c36d29b3a333383e3497203e1980f029bc7a845ae0575 not found: ID does not exist" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.327371 4693 scope.go:117] "RemoveContainer" containerID="01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033" Dec 04 10:59:16 crc kubenswrapper[4693]: E1204 10:59:16.327622 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033\": container with ID starting with 01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033 not found: ID does not exist" containerID="01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.327646 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033"} err="failed to get container status \"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033\": rpc error: code = NotFound desc = could not find container \"01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033\": container with ID starting with 01a24848c803ca29515269124fcad0622464f385c14a0654263fd5bb6de47033 not found: ID does not exist" Dec 04 10:59:16 crc kubenswrapper[4693]: I1204 10:59:16.473946 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" path="/var/lib/kubelet/pods/d3eb9afe-ec99-4a3e-9304-f5114731da4b/volumes" Dec 04 10:59:24 crc kubenswrapper[4693]: I1204 10:59:24.468240 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:59:24 crc kubenswrapper[4693]: E1204 10:59:24.469106 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:59:39 crc kubenswrapper[4693]: I1204 10:59:39.460952 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:59:39 crc kubenswrapper[4693]: E1204 10:59:39.461711 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 10:59:52 crc kubenswrapper[4693]: I1204 10:59:52.461118 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 10:59:52 crc kubenswrapper[4693]: E1204 10:59:52.461968 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.169311 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf"] Dec 04 11:00:00 crc kubenswrapper[4693]: E1204 11:00:00.170396 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="extract-utilities" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.170424 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="extract-utilities" Dec 04 11:00:00 crc kubenswrapper[4693]: E1204 11:00:00.170437 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="extract-content" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.170445 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="extract-content" Dec 04 11:00:00 crc kubenswrapper[4693]: E1204 11:00:00.170471 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="registry-server" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.170479 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="registry-server" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.170697 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eb9afe-ec99-4a3e-9304-f5114731da4b" containerName="registry-server" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.171373 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.173563 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.174455 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.193091 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf"] Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.341835 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.341907 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmdgk\" (UniqueName: \"kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.341978 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.444342 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.444416 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmdgk\" (UniqueName: \"kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.444459 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.445651 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.450760 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.464666 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmdgk\" (UniqueName: \"kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk\") pod \"collect-profiles-29414100-pwzwf\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.495962 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:00 crc kubenswrapper[4693]: I1204 11:00:00.986070 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf"] Dec 04 11:00:01 crc kubenswrapper[4693]: I1204 11:00:01.606555 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" event={"ID":"a1ce260a-394b-4901-b6ee-6ea481adfaf7","Type":"ContainerStarted","Data":"51e3e996797ed6ffbdfa418c7da74a0ed9dafc146804b51843db307402e4a2da"} Dec 04 11:00:01 crc kubenswrapper[4693]: I1204 11:00:01.606609 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" event={"ID":"a1ce260a-394b-4901-b6ee-6ea481adfaf7","Type":"ContainerStarted","Data":"1ae70207658612fac5a9f446faeb299b9074e5011726957056b6dec713b93bbe"} Dec 04 11:00:01 crc kubenswrapper[4693]: I1204 11:00:01.630719 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" podStartSLOduration=1.6306963140000001 podStartE2EDuration="1.630696314s" podCreationTimestamp="2025-12-04 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:00:01.626351857 +0000 UTC m=+4647.523945610" watchObservedRunningTime="2025-12-04 11:00:01.630696314 +0000 UTC m=+4647.528290067" Dec 04 11:00:02 crc kubenswrapper[4693]: I1204 11:00:02.616988 4693 generic.go:334] "Generic (PLEG): container finished" podID="a1ce260a-394b-4901-b6ee-6ea481adfaf7" containerID="51e3e996797ed6ffbdfa418c7da74a0ed9dafc146804b51843db307402e4a2da" exitCode=0 Dec 04 11:00:02 crc kubenswrapper[4693]: I1204 11:00:02.617061 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" event={"ID":"a1ce260a-394b-4901-b6ee-6ea481adfaf7","Type":"ContainerDied","Data":"51e3e996797ed6ffbdfa418c7da74a0ed9dafc146804b51843db307402e4a2da"} Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.135474 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.249139 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume\") pod \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.249409 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume\") pod \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.249472 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmdgk\" (UniqueName: \"kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk\") pod \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\" (UID: \"a1ce260a-394b-4901-b6ee-6ea481adfaf7\") " Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.250321 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1ce260a-394b-4901-b6ee-6ea481adfaf7" (UID: "a1ce260a-394b-4901-b6ee-6ea481adfaf7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.255033 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1ce260a-394b-4901-b6ee-6ea481adfaf7" (UID: "a1ce260a-394b-4901-b6ee-6ea481adfaf7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.255496 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk" (OuterVolumeSpecName: "kube-api-access-rmdgk") pod "a1ce260a-394b-4901-b6ee-6ea481adfaf7" (UID: "a1ce260a-394b-4901-b6ee-6ea481adfaf7"). InnerVolumeSpecName "kube-api-access-rmdgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.352532 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1ce260a-394b-4901-b6ee-6ea481adfaf7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.352564 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmdgk\" (UniqueName: \"kubernetes.io/projected/a1ce260a-394b-4901-b6ee-6ea481adfaf7-kube-api-access-rmdgk\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.352577 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1ce260a-394b-4901-b6ee-6ea481adfaf7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.652416 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" event={"ID":"a1ce260a-394b-4901-b6ee-6ea481adfaf7","Type":"ContainerDied","Data":"1ae70207658612fac5a9f446faeb299b9074e5011726957056b6dec713b93bbe"} Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.652459 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ae70207658612fac5a9f446faeb299b9074e5011726957056b6dec713b93bbe" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.652551 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-pwzwf" Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.706460 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk"] Dec 04 11:00:04 crc kubenswrapper[4693]: I1204 11:00:04.714726 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-gfcpk"] Dec 04 11:00:05 crc kubenswrapper[4693]: I1204 11:00:05.462386 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:00:05 crc kubenswrapper[4693]: E1204 11:00:05.462934 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:00:06 crc kubenswrapper[4693]: I1204 11:00:06.539796 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea614f99-9d53-4a2e-b796-e2e603bac316" path="/var/lib/kubelet/pods/ea614f99-9d53-4a2e-b796-e2e603bac316/volumes" Dec 04 11:00:12 crc kubenswrapper[4693]: I1204 11:00:12.061547 4693 scope.go:117] "RemoveContainer" containerID="74364817e68e39e08e34c2f883ea44961f45d87f86e3c63a855eacbf953ec317" Dec 04 11:00:17 crc kubenswrapper[4693]: I1204 11:00:17.462045 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:00:17 crc kubenswrapper[4693]: E1204 11:00:17.462983 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:00:32 crc kubenswrapper[4693]: I1204 11:00:32.461545 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:00:32 crc kubenswrapper[4693]: E1204 11:00:32.462389 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:00:47 crc kubenswrapper[4693]: I1204 11:00:47.462048 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:00:47 crc kubenswrapper[4693]: E1204 11:00:47.462781 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.161444 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414101-snzll"] Dec 04 11:01:00 crc kubenswrapper[4693]: E1204 11:01:00.162362 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ce260a-394b-4901-b6ee-6ea481adfaf7" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.162375 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ce260a-394b-4901-b6ee-6ea481adfaf7" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.162572 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ce260a-394b-4901-b6ee-6ea481adfaf7" containerName="collect-profiles" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.163235 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.175570 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-snzll"] Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.201099 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.201157 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.201178 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.201230 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgmr\" (UniqueName: \"kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.304287 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.304395 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.304421 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.304499 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgmr\" (UniqueName: \"kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.317495 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.317570 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.317611 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.330549 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgmr\" (UniqueName: \"kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr\") pod \"keystone-cron-29414101-snzll\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.482149 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:00 crc kubenswrapper[4693]: I1204 11:01:00.984389 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-snzll"] Dec 04 11:01:01 crc kubenswrapper[4693]: I1204 11:01:01.168519 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-snzll" event={"ID":"e7a046b6-d7ff-46b3-a107-58e6e843bfff","Type":"ContainerStarted","Data":"f46408b0b8d92bb1952b0f03504463e0b045e075ff467be42f3bc68ccf0f09da"} Dec 04 11:01:01 crc kubenswrapper[4693]: I1204 11:01:01.461283 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:01:01 crc kubenswrapper[4693]: E1204 11:01:01.461625 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:01:02 crc kubenswrapper[4693]: I1204 11:01:02.178546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-snzll" event={"ID":"e7a046b6-d7ff-46b3-a107-58e6e843bfff","Type":"ContainerStarted","Data":"7c16bdaaaaca12ec15f602ac4a63c05e881363dcd83646c46272d8f89eb01f9a"} Dec 04 11:01:02 crc kubenswrapper[4693]: I1204 11:01:02.197185 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414101-snzll" podStartSLOduration=2.197165381 podStartE2EDuration="2.197165381s" podCreationTimestamp="2025-12-04 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:01:02.194823128 +0000 UTC m=+4708.092416881" 
watchObservedRunningTime="2025-12-04 11:01:02.197165381 +0000 UTC m=+4708.094759144" Dec 04 11:01:05 crc kubenswrapper[4693]: I1204 11:01:05.211503 4693 generic.go:334] "Generic (PLEG): container finished" podID="e7a046b6-d7ff-46b3-a107-58e6e843bfff" containerID="7c16bdaaaaca12ec15f602ac4a63c05e881363dcd83646c46272d8f89eb01f9a" exitCode=0 Dec 04 11:01:05 crc kubenswrapper[4693]: I1204 11:01:05.211599 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-snzll" event={"ID":"e7a046b6-d7ff-46b3-a107-58e6e843bfff","Type":"ContainerDied","Data":"7c16bdaaaaca12ec15f602ac4a63c05e881363dcd83646c46272d8f89eb01f9a"} Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.781387 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.849894 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle\") pod \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.849988 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys\") pod \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.850031 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data\") pod \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.850081 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgmr\" (UniqueName: \"kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr\") pod \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\" (UID: \"e7a046b6-d7ff-46b3-a107-58e6e843bfff\") " Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.858399 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e7a046b6-d7ff-46b3-a107-58e6e843bfff" (UID: "e7a046b6-d7ff-46b3-a107-58e6e843bfff"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.871053 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr" (OuterVolumeSpecName: "kube-api-access-jhgmr") pod "e7a046b6-d7ff-46b3-a107-58e6e843bfff" (UID: "e7a046b6-d7ff-46b3-a107-58e6e843bfff"). InnerVolumeSpecName "kube-api-access-jhgmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.895769 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7a046b6-d7ff-46b3-a107-58e6e843bfff" (UID: "e7a046b6-d7ff-46b3-a107-58e6e843bfff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.909682 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data" (OuterVolumeSpecName: "config-data") pod "e7a046b6-d7ff-46b3-a107-58e6e843bfff" (UID: "e7a046b6-d7ff-46b3-a107-58e6e843bfff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.952387 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.952420 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.952431 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a046b6-d7ff-46b3-a107-58e6e843bfff-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:06 crc kubenswrapper[4693]: I1204 11:01:06.952444 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgmr\" (UniqueName: \"kubernetes.io/projected/e7a046b6-d7ff-46b3-a107-58e6e843bfff-kube-api-access-jhgmr\") on node \"crc\" DevicePath \"\"" Dec 04 11:01:07 crc kubenswrapper[4693]: I1204 11:01:07.249180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-snzll" event={"ID":"e7a046b6-d7ff-46b3-a107-58e6e843bfff","Type":"ContainerDied","Data":"f46408b0b8d92bb1952b0f03504463e0b045e075ff467be42f3bc68ccf0f09da"} Dec 04 11:01:07 crc kubenswrapper[4693]: I1204 11:01:07.249231 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f46408b0b8d92bb1952b0f03504463e0b045e075ff467be42f3bc68ccf0f09da" Dec 04 11:01:07 crc kubenswrapper[4693]: I1204 11:01:07.249317 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414101-snzll" Dec 04 11:01:16 crc kubenswrapper[4693]: I1204 11:01:16.461812 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:01:16 crc kubenswrapper[4693]: E1204 11:01:16.462584 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:01:30 crc kubenswrapper[4693]: I1204 11:01:30.461541 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:01:30 crc kubenswrapper[4693]: E1204 11:01:30.462298 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:01:44 crc kubenswrapper[4693]: I1204 11:01:44.475623 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:01:44 crc kubenswrapper[4693]: E1204 11:01:44.478213 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:01:58 crc kubenswrapper[4693]: I1204 11:01:58.460742 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:01:58 crc kubenswrapper[4693]: E1204 11:01:58.461522 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:02:12 crc kubenswrapper[4693]: I1204 11:02:12.461775 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:02:12 crc kubenswrapper[4693]: E1204 11:02:12.462533 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:02:27 crc kubenswrapper[4693]: I1204 11:02:27.461719 4693 scope.go:117] "RemoveContainer" 
containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:02:27 crc kubenswrapper[4693]: I1204 11:02:27.966568 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2"} Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.648897 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:01 crc kubenswrapper[4693]: E1204 11:03:01.649851 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a046b6-d7ff-46b3-a107-58e6e843bfff" containerName="keystone-cron" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.649927 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a046b6-d7ff-46b3-a107-58e6e843bfff" containerName="keystone-cron" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.650169 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a046b6-d7ff-46b3-a107-58e6e843bfff" containerName="keystone-cron" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.653138 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.664206 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.704247 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.704451 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.704505 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nzpx\" (UniqueName: \"kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.806179 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.806932 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " 
pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.806935 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nzpx\" (UniqueName: \"kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.807116 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.807598 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.839457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nzpx\" (UniqueName: \"kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx\") pod \"community-operators-zjsb7\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:01 crc kubenswrapper[4693]: I1204 11:03:01.970126 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:02 crc kubenswrapper[4693]: I1204 11:03:02.530699 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:03 crc kubenswrapper[4693]: I1204 11:03:03.298576 4693 generic.go:334] "Generic (PLEG): container finished" podID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerID="5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2" exitCode=0 Dec 04 11:03:03 crc kubenswrapper[4693]: I1204 11:03:03.298669 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerDied","Data":"5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2"} Dec 04 11:03:03 crc kubenswrapper[4693]: I1204 11:03:03.299606 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerStarted","Data":"1b0d79c591311ef5e79a32738f0ff5fddf8c1fa4c792b009e7730cf5eabf4bae"} Dec 04 11:03:03 crc kubenswrapper[4693]: I1204 11:03:03.301155 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:03:05 crc kubenswrapper[4693]: I1204 11:03:05.317789 4693 generic.go:334] "Generic (PLEG): container finished" podID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerID="47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2" exitCode=0 Dec 04 11:03:05 crc kubenswrapper[4693]: I1204 11:03:05.317879 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" 
event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerDied","Data":"47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2"} Dec 04 11:03:06 crc kubenswrapper[4693]: I1204 11:03:06.327994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerStarted","Data":"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab"} Dec 04 11:03:06 crc kubenswrapper[4693]: I1204 11:03:06.355285 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zjsb7" podStartSLOduration=2.970340656 podStartE2EDuration="5.355263495s" podCreationTimestamp="2025-12-04 11:03:01 +0000 UTC" firstStartedPulling="2025-12-04 11:03:03.300828013 +0000 UTC m=+4829.198421766" lastFinishedPulling="2025-12-04 11:03:05.685750862 +0000 UTC m=+4831.583344605" observedRunningTime="2025-12-04 11:03:06.346579621 +0000 UTC m=+4832.244173384" watchObservedRunningTime="2025-12-04 11:03:06.355263495 +0000 UTC m=+4832.252857248" Dec 04 11:03:11 crc kubenswrapper[4693]: I1204 11:03:11.970423 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:11 crc kubenswrapper[4693]: I1204 11:03:11.970964 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:12 crc kubenswrapper[4693]: I1204 11:03:12.019243 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:12 crc kubenswrapper[4693]: I1204 11:03:12.424990 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:12 crc kubenswrapper[4693]: I1204 11:03:12.472719 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.392900 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zjsb7" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="registry-server" containerID="cri-o://5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab" gracePeriod=2 Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.960235 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.977381 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content\") pod \"1f7c8266-2f13-4bf0-9d13-73062537290d\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.977539 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nzpx\" (UniqueName: \"kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx\") pod \"1f7c8266-2f13-4bf0-9d13-73062537290d\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.977696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities\") pod \"1f7c8266-2f13-4bf0-9d13-73062537290d\" (UID: \"1f7c8266-2f13-4bf0-9d13-73062537290d\") " Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.979036 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities" (OuterVolumeSpecName: "utilities") pod "1f7c8266-2f13-4bf0-9d13-73062537290d" (UID: "1f7c8266-2f13-4bf0-9d13-73062537290d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:03:14 crc kubenswrapper[4693]: I1204 11:03:14.983625 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx" (OuterVolumeSpecName: "kube-api-access-4nzpx") pod "1f7c8266-2f13-4bf0-9d13-73062537290d" (UID: "1f7c8266-2f13-4bf0-9d13-73062537290d"). InnerVolumeSpecName "kube-api-access-4nzpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.080105 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nzpx\" (UniqueName: \"kubernetes.io/projected/1f7c8266-2f13-4bf0-9d13-73062537290d-kube-api-access-4nzpx\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.080310 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.326460 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7c8266-2f13-4bf0-9d13-73062537290d" (UID: "1f7c8266-2f13-4bf0-9d13-73062537290d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.385611 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7c8266-2f13-4bf0-9d13-73062537290d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.404936 4693 generic.go:334] "Generic (PLEG): container finished" podID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerID="5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab" exitCode=0 Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.404987 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerDied","Data":"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab"} Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.405002 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zjsb7" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.405025 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjsb7" event={"ID":"1f7c8266-2f13-4bf0-9d13-73062537290d","Type":"ContainerDied","Data":"1b0d79c591311ef5e79a32738f0ff5fddf8c1fa4c792b009e7730cf5eabf4bae"} Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.405045 4693 scope.go:117] "RemoveContainer" containerID="5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.446867 4693 scope.go:117] "RemoveContainer" containerID="47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.458702 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.469001 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zjsb7"] Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.477683 4693 scope.go:117] "RemoveContainer" containerID="5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.544475 4693 scope.go:117] "RemoveContainer" containerID="5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab" Dec 04 11:03:15 crc kubenswrapper[4693]: E1204 11:03:15.545178 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab\": container with ID starting with 5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab not found: ID does not exist" containerID="5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.545215 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab"} err="failed to get container status \"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab\": rpc error: code = NotFound desc = could not find container \"5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab\": container with ID starting with 5a69a0580af54375c50d8f566410e01bc2080c182f6e8888eff8a913f514afab not found: ID does not exist" Dec 04 
11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.545236 4693 scope.go:117] "RemoveContainer" containerID="47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2" Dec 04 11:03:15 crc kubenswrapper[4693]: E1204 11:03:15.545621 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2\": container with ID starting with 47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2 not found: ID does not exist" containerID="47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.545675 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2"} err="failed to get container status \"47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2\": rpc error: code = NotFound desc = could not find container \"47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2\": container with ID starting with 47b11dd41a1979a0f0b5fd9db53c27b50f1ef50fa74adc6ea982e7d6dab4e1a2 not found: ID does not exist" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.545712 4693 scope.go:117] "RemoveContainer" containerID="5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2" Dec 04 11:03:15 crc kubenswrapper[4693]: E1204 11:03:15.546007 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2\": container with ID starting with 5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2 not found: ID does not exist" containerID="5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2" Dec 04 11:03:15 crc kubenswrapper[4693]: I1204 11:03:15.546028 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2"} err="failed to get container status \"5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2\": rpc error: code = NotFound desc = could not find container \"5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2\": container with ID starting with 5a211bc06c7e54e4bb668f2c378e4bb68192ee31ea80d21f8a9b1041e5b84bb2 not found: ID does not exist" Dec 04 11:03:16 crc kubenswrapper[4693]: I1204 11:03:16.471412 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" path="/var/lib/kubelet/pods/1f7c8266-2f13-4bf0-9d13-73062537290d/volumes" Dec 04 11:04:52 crc kubenswrapper[4693]: I1204 11:04:52.272407 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:04:52 crc kubenswrapper[4693]: I1204 11:04:52.272993 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:05:22 crc kubenswrapper[4693]: I1204 11:05:22.273637 4693 patch_prober.go:28] 
interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:05:22 crc kubenswrapper[4693]: I1204 11:05:22.274402 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.273268 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.274579 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.274665 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.276114 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.276218 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2" gracePeriod=600 Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.827580 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2" exitCode=0 Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.827656 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2"} Dec 04 11:05:52 crc kubenswrapper[4693]: I1204 11:05:52.827936 4693 scope.go:117] "RemoveContainer" containerID="964445ce07aa791d844882d620b42cd5bbc2deae4c905d4946d6c1ea436c952d" Dec 04 11:05:54 crc kubenswrapper[4693]: I1204 11:05:54.851911 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836"} Dec 04 
11:08:22 crc kubenswrapper[4693]: I1204 11:08:22.273109 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:08:22 crc kubenswrapper[4693]: I1204 11:08:22.273690 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:08:52 crc kubenswrapper[4693]: I1204 11:08:52.273260 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:08:52 crc kubenswrapper[4693]: I1204 11:08:52.273905 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.273591 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.274209 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.274260 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.275000 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.275048 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" gracePeriod=600 Dec 04 11:09:22 crc kubenswrapper[4693]: E1204 11:09:22.393932 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.689953 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" exitCode=0 Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.690003 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836"} Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.690596 4693 scope.go:117] "RemoveContainer" containerID="d66556eececfe05300b379ec0667fc9849d19d6cada67a21fc7c4bd62f67fea2" Dec 04 11:09:22 crc kubenswrapper[4693]: I1204 11:09:22.693559 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:09:22 crc kubenswrapper[4693]: E1204 11:09:22.693953 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:09:33 crc kubenswrapper[4693]: I1204 11:09:33.460753 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:09:33 crc kubenswrapper[4693]: E1204 11:09:33.461519 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:09:45 crc kubenswrapper[4693]: I1204 11:09:45.461442 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:09:45 crc kubenswrapper[4693]: E1204 11:09:45.462420 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:09:58 crc kubenswrapper[4693]: I1204 11:09:58.462063 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:09:58 crc kubenswrapper[4693]: E1204 11:09:58.462797 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:10:13 crc kubenswrapper[4693]: I1204 11:10:13.460901 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:10:13 crc kubenswrapper[4693]: E1204 11:10:13.461911 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:10:21 crc kubenswrapper[4693]: I1204 11:10:21.248067 4693 generic.go:334] "Generic (PLEG): container finished" podID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" containerID="5c5fd62ba406b129efbb25274fbbbd1e4f012e1ffadfa270f6bbfd59997db366" exitCode=0 Dec 04 11:10:21 crc kubenswrapper[4693]: I1204 11:10:21.248694 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da1a36ac-5a8b-475d-8434-eb43b0f8a656","Type":"ContainerDied","Data":"5c5fd62ba406b129efbb25274fbbbd1e4f012e1ffadfa270f6bbfd59997db366"} Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.648309 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760398 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760440 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760459 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760483 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760556 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760596 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760618 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760703 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97fcp\" (UniqueName: \"kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.760756 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret\") pod \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\" (UID: \"da1a36ac-5a8b-475d-8434-eb43b0f8a656\") " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.761495 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.761679 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data" (OuterVolumeSpecName: "config-data") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.766961 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.771345 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp" (OuterVolumeSpecName: "kube-api-access-97fcp") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "kube-api-access-97fcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.776311 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). 
InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.791601 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.796663 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.803756 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.819030 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "da1a36ac-5a8b-475d-8434-eb43b0f8a656" (UID: "da1a36ac-5a8b-475d-8434-eb43b0f8a656"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.863927 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.864025 4693 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.864091 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.864180 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.864538 4693 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.864923 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da1a36ac-5a8b-475d-8434-eb43b0f8a656-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.865042 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1a36ac-5a8b-475d-8434-eb43b0f8a656-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.865174 4693 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da1a36ac-5a8b-475d-8434-eb43b0f8a656-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.865268 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97fcp\" (UniqueName: \"kubernetes.io/projected/da1a36ac-5a8b-475d-8434-eb43b0f8a656-kube-api-access-97fcp\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.892060 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 04 11:10:22 crc kubenswrapper[4693]: I1204 11:10:22.967553 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 04 11:10:23 crc kubenswrapper[4693]: I1204 11:10:23.265358 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da1a36ac-5a8b-475d-8434-eb43b0f8a656","Type":"ContainerDied","Data":"bbb55b035441f891046b061192a148c683adbc49b60e4b515d4735944ee45b77"} Dec 04 11:10:23 crc kubenswrapper[4693]: I1204 11:10:23.265401 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbb55b035441f891046b061192a148c683adbc49b60e4b515d4735944ee45b77" Dec 04 11:10:23 crc 
kubenswrapper[4693]: I1204 11:10:23.265407 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.007104 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 11:10:26 crc kubenswrapper[4693]: E1204 11:10:26.008259 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" containerName="tempest-tests-tempest-tests-runner" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008278 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" containerName="tempest-tests-tempest-tests-runner" Dec 04 11:10:26 crc kubenswrapper[4693]: E1204 11:10:26.008295 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="registry-server" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008301 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="registry-server" Dec 04 11:10:26 crc kubenswrapper[4693]: E1204 11:10:26.008313 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="extract-content" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008321 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="extract-content" Dec 04 11:10:26 crc kubenswrapper[4693]: E1204 11:10:26.008355 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="extract-utilities" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008364 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="extract-utilities" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008593 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c8266-2f13-4bf0-9d13-73062537290d" containerName="registry-server" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.008615 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1a36ac-5a8b-475d-8434-eb43b0f8a656" containerName="tempest-tests-tempest-tests-runner" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.009378 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.015443 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.050292 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.050366 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hxd\" (UniqueName: \"kubernetes.io/projected/0e4ff807-907a-4310-97c4-7e60e55dcaca-kube-api-access-w2hxd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.151389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.151449 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hxd\" (UniqueName: \"kubernetes.io/projected/0e4ff807-907a-4310-97c4-7e60e55dcaca-kube-api-access-w2hxd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.151942 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.170602 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hxd\" (UniqueName: \"kubernetes.io/projected/0e4ff807-907a-4310-97c4-7e60e55dcaca-kube-api-access-w2hxd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.183938 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4ff807-907a-4310-97c4-7e60e55dcaca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.363634 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.804307 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 11:10:26 crc kubenswrapper[4693]: I1204 11:10:26.809078 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:10:27 crc kubenswrapper[4693]: I1204 11:10:27.298115 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0e4ff807-907a-4310-97c4-7e60e55dcaca","Type":"ContainerStarted","Data":"5841d62ed5f81e75fbf03cb1a252b563c104994b7a0653a12e19626af1afad78"} Dec 04 11:10:27 crc kubenswrapper[4693]: I1204 11:10:27.462770 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:10:27 crc kubenswrapper[4693]: E1204 11:10:27.463291 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:10:28 crc kubenswrapper[4693]: I1204 11:10:28.307766 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0e4ff807-907a-4310-97c4-7e60e55dcaca","Type":"ContainerStarted","Data":"5018f9bc5708e4c152720273232654504951fb8a79ad98f256032095782ca00e"} Dec 04 11:10:28 crc kubenswrapper[4693]: I1204 11:10:28.334825 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.485594253 podStartE2EDuration="3.33479601s" podCreationTimestamp="2025-12-04 11:10:25 +0000 UTC" firstStartedPulling="2025-12-04 11:10:26.808870261 +0000 UTC m=+5272.706464014" lastFinishedPulling="2025-12-04 11:10:27.658072008 +0000 UTC m=+5273.555665771" observedRunningTime="2025-12-04 11:10:28.324458403 +0000 UTC m=+5274.222052156" watchObservedRunningTime="2025-12-04 11:10:28.33479601 +0000 UTC m=+5274.232389773" Dec 04 11:10:42 crc kubenswrapper[4693]: I1204 11:10:42.461616 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:10:42 crc kubenswrapper[4693]: E1204 11:10:42.462524 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.673806 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zljj5/must-gather-8jmx2"] Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.676921 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.678747 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zljj5"/"default-dockercfg-tmns2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.678790 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zljj5"/"openshift-service-ca.crt" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.680536 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zljj5"/"kube-root-ca.crt" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.687383 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zljj5/must-gather-8jmx2"] Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.736378 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwksz\" (UniqueName: \"kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.736470 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.838459 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwksz\" (UniqueName: \"kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.838530 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.839005 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.863007 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwksz\" (UniqueName: \"kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz\") pod \"must-gather-8jmx2\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:52 crc kubenswrapper[4693]: I1204 11:10:52.995720 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:10:53 crc kubenswrapper[4693]: I1204 11:10:53.433062 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zljj5/must-gather-8jmx2"] Dec 04 11:10:53 crc kubenswrapper[4693]: I1204 11:10:53.608540 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/must-gather-8jmx2" event={"ID":"ef96b9a9-00d5-410f-a233-c4df68302d90","Type":"ContainerStarted","Data":"7356e4e489f8bebc6d0f513be1febf1459958d15e3e78bbe3a83deecbfac29c0"} Dec 04 11:10:56 crc kubenswrapper[4693]: I1204 11:10:56.462128 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:10:56 crc kubenswrapper[4693]: E1204 11:10:56.462767 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:10:58 crc kubenswrapper[4693]: I1204 11:10:58.672105 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/must-gather-8jmx2" event={"ID":"ef96b9a9-00d5-410f-a233-c4df68302d90","Type":"ContainerStarted","Data":"b91533a9e6f4e4172ac4363b1c60c0d495cafd811e2d7a19867d53a678edbbcb"} Dec 04 11:10:58 crc kubenswrapper[4693]: I1204 11:10:58.672707 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/must-gather-8jmx2" event={"ID":"ef96b9a9-00d5-410f-a233-c4df68302d90","Type":"ContainerStarted","Data":"cf93b52f8706c2d851e302cfcf7ceae09c281b25ecab84c9716b3a1e38848696"} Dec 04 11:11:03 crc kubenswrapper[4693]: I1204 11:11:03.864964 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zljj5/must-gather-8jmx2" podStartSLOduration=7.987861177 podStartE2EDuration="11.86494407s" podCreationTimestamp="2025-12-04 11:10:52 +0000 UTC" firstStartedPulling="2025-12-04 11:10:53.446726604 +0000 UTC m=+5299.344320357" lastFinishedPulling="2025-12-04 11:10:57.323809497 +0000 UTC m=+5303.221403250" observedRunningTime="2025-12-04 11:10:58.688685914 +0000 UTC m=+5304.586279697" watchObservedRunningTime="2025-12-04 11:11:03.86494407 +0000 UTC m=+5309.762537813" Dec 04 11:11:03 crc kubenswrapper[4693]: I1204 11:11:03.867150 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zljj5/crc-debug-t5rtp"] Dec 04 11:11:03 crc kubenswrapper[4693]: I1204 11:11:03.868484 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:03 crc kubenswrapper[4693]: I1204 11:11:03.970596 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:03 crc kubenswrapper[4693]: I1204 11:11:03.971104 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n7xs\" (UniqueName: \"kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.073029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n7xs\" (UniqueName: \"kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.073116 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.073248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.109067 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n7xs\" (UniqueName: \"kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs\") pod \"crc-debug-t5rtp\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.189189 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:11:04 crc kubenswrapper[4693]: I1204 11:11:04.730757 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" event={"ID":"d7422b9d-65aa-42f2-afb3-4b26544a1434","Type":"ContainerStarted","Data":"d8b1d9bcbaeaa637c3717b830863db25f5d8113d2ff575bbe04d6f0050b0de95"} Dec 04 11:11:11 crc kubenswrapper[4693]: I1204 11:11:11.461356 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:11:11 crc kubenswrapper[4693]: E1204 11:11:11.462039 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:11:16 crc kubenswrapper[4693]: I1204 11:11:16.870491 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" event={"ID":"d7422b9d-65aa-42f2-afb3-4b26544a1434","Type":"ContainerStarted","Data":"d9430d8b6ade17028afabc35feb097d4f94fc94e78a81f4a27f37ae7978b5ab5"} Dec 04 11:11:16 crc kubenswrapper[4693]: I1204 11:11:16.899808 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" podStartSLOduration=1.776233404 podStartE2EDuration="13.89979166s" podCreationTimestamp="2025-12-04 11:11:03 +0000 UTC" firstStartedPulling="2025-12-04 11:11:04.225168589 +0000 UTC m=+5310.122762342" lastFinishedPulling="2025-12-04 11:11:16.348726845 +0000 UTC m=+5322.246320598" observedRunningTime="2025-12-04 11:11:16.883015072 +0000 UTC m=+5322.780608825" watchObservedRunningTime="2025-12-04 11:11:16.89979166 +0000 UTC m=+5322.797385413" Dec 04 11:11:24 crc kubenswrapper[4693]: I1204 11:11:24.469586 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:11:24 crc kubenswrapper[4693]: E1204 11:11:24.470560 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:11:39 crc kubenswrapper[4693]: I1204 11:11:39.462099 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:11:39 crc kubenswrapper[4693]: E1204 11:11:39.464475 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:11:51 crc kubenswrapper[4693]: I1204 11:11:51.461176 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 
11:11:51 crc kubenswrapper[4693]: E1204 11:11:51.461993 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:12:03 crc kubenswrapper[4693]: I1204 11:12:03.328206 4693 generic.go:334] "Generic (PLEG): container finished" podID="d7422b9d-65aa-42f2-afb3-4b26544a1434" containerID="d9430d8b6ade17028afabc35feb097d4f94fc94e78a81f4a27f37ae7978b5ab5" exitCode=0 Dec 04 11:12:03 crc kubenswrapper[4693]: I1204 11:12:03.328286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" event={"ID":"d7422b9d-65aa-42f2-afb3-4b26544a1434","Type":"ContainerDied","Data":"d9430d8b6ade17028afabc35feb097d4f94fc94e78a81f4a27f37ae7978b5ab5"} Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.473495 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.512446 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-t5rtp"] Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.521083 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-t5rtp"] Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.603558 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host\") pod \"d7422b9d-65aa-42f2-afb3-4b26544a1434\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.603720 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host" (OuterVolumeSpecName: "host") pod "d7422b9d-65aa-42f2-afb3-4b26544a1434" (UID: "d7422b9d-65aa-42f2-afb3-4b26544a1434"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.604136 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n7xs\" (UniqueName: \"kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs\") pod \"d7422b9d-65aa-42f2-afb3-4b26544a1434\" (UID: \"d7422b9d-65aa-42f2-afb3-4b26544a1434\") " Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.605245 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7422b9d-65aa-42f2-afb3-4b26544a1434-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.610761 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs" (OuterVolumeSpecName: "kube-api-access-2n7xs") pod "d7422b9d-65aa-42f2-afb3-4b26544a1434" (UID: "d7422b9d-65aa-42f2-afb3-4b26544a1434"). InnerVolumeSpecName "kube-api-access-2n7xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:04 crc kubenswrapper[4693]: I1204 11:12:04.706266 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n7xs\" (UniqueName: \"kubernetes.io/projected/d7422b9d-65aa-42f2-afb3-4b26544a1434-kube-api-access-2n7xs\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.346943 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b1d9bcbaeaa637c3717b830863db25f5d8113d2ff575bbe04d6f0050b0de95" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.347001 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-t5rtp" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.667431 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zljj5/crc-debug-qqqx8"] Dec 04 11:12:05 crc kubenswrapper[4693]: E1204 11:12:05.667874 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7422b9d-65aa-42f2-afb3-4b26544a1434" containerName="container-00" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.667893 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7422b9d-65aa-42f2-afb3-4b26544a1434" containerName="container-00" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.668220 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7422b9d-65aa-42f2-afb3-4b26544a1434" containerName="container-00" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.669025 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.724624 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295kj\" (UniqueName: \"kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.724816 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.826869 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.826956 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295kj\" (UniqueName: \"kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.827025 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " 
pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.863580 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295kj\" (UniqueName: \"kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj\") pod \"crc-debug-qqqx8\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:05 crc kubenswrapper[4693]: I1204 11:12:05.988582 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:06 crc kubenswrapper[4693]: I1204 11:12:06.357180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" event={"ID":"b2018ccb-bfb0-4d17-b976-260ef56005c7","Type":"ContainerStarted","Data":"d80c13474cd515121c4756c722872ad5d43b871ad08615aa2a543405522317ce"} Dec 04 11:12:06 crc kubenswrapper[4693]: I1204 11:12:06.357585 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" event={"ID":"b2018ccb-bfb0-4d17-b976-260ef56005c7","Type":"ContainerStarted","Data":"4a45f80f2f203e58d88eeb45d637dcd9896f1e94f93daf7a21a0779cbfc9f2c9"} Dec 04 11:12:06 crc kubenswrapper[4693]: I1204 11:12:06.369294 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" podStartSLOduration=1.369275734 podStartE2EDuration="1.369275734s" podCreationTimestamp="2025-12-04 11:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:12:06.3672386 +0000 UTC m=+5372.264832363" watchObservedRunningTime="2025-12-04 11:12:06.369275734 +0000 UTC m=+5372.266869487" Dec 04 11:12:06 crc kubenswrapper[4693]: I1204 11:12:06.461058 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:12:06 crc kubenswrapper[4693]: E1204 11:12:06.461398 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:12:06 crc kubenswrapper[4693]: I1204 11:12:06.474636 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7422b9d-65aa-42f2-afb3-4b26544a1434" path="/var/lib/kubelet/pods/d7422b9d-65aa-42f2-afb3-4b26544a1434/volumes" Dec 04 11:12:07 crc kubenswrapper[4693]: I1204 11:12:07.370587 4693 generic.go:334] "Generic (PLEG): container finished" podID="b2018ccb-bfb0-4d17-b976-260ef56005c7" containerID="d80c13474cd515121c4756c722872ad5d43b871ad08615aa2a543405522317ce" exitCode=0 Dec 04 11:12:07 crc kubenswrapper[4693]: I1204 11:12:07.370668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" event={"ID":"b2018ccb-bfb0-4d17-b976-260ef56005c7","Type":"ContainerDied","Data":"d80c13474cd515121c4756c722872ad5d43b871ad08615aa2a543405522317ce"} Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.511277 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.575986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host\") pod \"b2018ccb-bfb0-4d17-b976-260ef56005c7\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.576093 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295kj\" (UniqueName: \"kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj\") pod \"b2018ccb-bfb0-4d17-b976-260ef56005c7\" (UID: \"b2018ccb-bfb0-4d17-b976-260ef56005c7\") " Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.576141 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host" (OuterVolumeSpecName: "host") pod "b2018ccb-bfb0-4d17-b976-260ef56005c7" (UID: "b2018ccb-bfb0-4d17-b976-260ef56005c7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.576839 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2018ccb-bfb0-4d17-b976-260ef56005c7-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.583502 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj" (OuterVolumeSpecName: "kube-api-access-295kj") pod "b2018ccb-bfb0-4d17-b976-260ef56005c7" (UID: "b2018ccb-bfb0-4d17-b976-260ef56005c7"). InnerVolumeSpecName "kube-api-access-295kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:08 crc kubenswrapper[4693]: I1204 11:12:08.679496 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295kj\" (UniqueName: \"kubernetes.io/projected/b2018ccb-bfb0-4d17-b976-260ef56005c7-kube-api-access-295kj\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:09 crc kubenswrapper[4693]: I1204 11:12:09.142840 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-qqqx8"] Dec 04 11:12:09 crc kubenswrapper[4693]: I1204 11:12:09.157475 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-qqqx8"] Dec 04 11:12:09 crc kubenswrapper[4693]: I1204 11:12:09.390060 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a45f80f2f203e58d88eeb45d637dcd9896f1e94f93daf7a21a0779cbfc9f2c9" Dec 04 11:12:09 crc kubenswrapper[4693]: I1204 11:12:09.390455 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-qqqx8" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.318780 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zljj5/crc-debug-47gqs"] Dec 04 11:12:10 crc kubenswrapper[4693]: E1204 11:12:10.319643 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2018ccb-bfb0-4d17-b976-260ef56005c7" containerName="container-00" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.319679 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2018ccb-bfb0-4d17-b976-260ef56005c7" containerName="container-00" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.320091 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2018ccb-bfb0-4d17-b976-260ef56005c7" containerName="container-00" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.321431 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.471646 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2018ccb-bfb0-4d17-b976-260ef56005c7" path="/var/lib/kubelet/pods/b2018ccb-bfb0-4d17-b976-260ef56005c7/volumes" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.513670 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.513777 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mctt\" (UniqueName: \"kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.615229 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.615595 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mctt\" (UniqueName: \"kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.615359 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.633379 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mctt\" (UniqueName: \"kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt\") pod \"crc-debug-47gqs\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " 
pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: I1204 11:12:10.640800 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:10 crc kubenswrapper[4693]: W1204 11:12:10.684819 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a68932_df18_486d_a483_84901d10c96e.slice/crio-bff6ec2436773c49482b25f3967bed835603856e80cbb10edfc2ebb35ee324bc WatchSource:0}: Error finding container bff6ec2436773c49482b25f3967bed835603856e80cbb10edfc2ebb35ee324bc: Status 404 returned error can't find the container with id bff6ec2436773c49482b25f3967bed835603856e80cbb10edfc2ebb35ee324bc Dec 04 11:12:11 crc kubenswrapper[4693]: I1204 11:12:11.409924 4693 generic.go:334] "Generic (PLEG): container finished" podID="81a68932-df18-486d-a483-84901d10c96e" containerID="023ed5793780b4e989834f228d729a18bd825498df570c1e94efcc09098996f0" exitCode=0 Dec 04 11:12:11 crc kubenswrapper[4693]: I1204 11:12:11.410013 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-47gqs" event={"ID":"81a68932-df18-486d-a483-84901d10c96e","Type":"ContainerDied","Data":"023ed5793780b4e989834f228d729a18bd825498df570c1e94efcc09098996f0"} Dec 04 11:12:11 crc kubenswrapper[4693]: I1204 11:12:11.410245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/crc-debug-47gqs" event={"ID":"81a68932-df18-486d-a483-84901d10c96e","Type":"ContainerStarted","Data":"bff6ec2436773c49482b25f3967bed835603856e80cbb10edfc2ebb35ee324bc"} Dec 04 11:12:11 crc kubenswrapper[4693]: I1204 11:12:11.447815 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-47gqs"] Dec 04 11:12:11 crc kubenswrapper[4693]: I1204 11:12:11.459771 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zljj5/crc-debug-47gqs"] Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.546896 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.657943 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host\") pod \"81a68932-df18-486d-a483-84901d10c96e\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.658010 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mctt\" (UniqueName: \"kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt\") pod \"81a68932-df18-486d-a483-84901d10c96e\" (UID: \"81a68932-df18-486d-a483-84901d10c96e\") " Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.658061 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host" (OuterVolumeSpecName: "host") pod "81a68932-df18-486d-a483-84901d10c96e" (UID: "81a68932-df18-486d-a483-84901d10c96e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.659338 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81a68932-df18-486d-a483-84901d10c96e-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.664566 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt" (OuterVolumeSpecName: "kube-api-access-2mctt") pod "81a68932-df18-486d-a483-84901d10c96e" (UID: "81a68932-df18-486d-a483-84901d10c96e"). InnerVolumeSpecName "kube-api-access-2mctt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:12:12 crc kubenswrapper[4693]: I1204 11:12:12.760906 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mctt\" (UniqueName: \"kubernetes.io/projected/81a68932-df18-486d-a483-84901d10c96e-kube-api-access-2mctt\") on node \"crc\" DevicePath \"\"" Dec 04 11:12:13 crc kubenswrapper[4693]: I1204 11:12:13.427531 4693 scope.go:117] "RemoveContainer" containerID="023ed5793780b4e989834f228d729a18bd825498df570c1e94efcc09098996f0" Dec 04 11:12:13 crc kubenswrapper[4693]: I1204 11:12:13.427597 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/crc-debug-47gqs" Dec 04 11:12:14 crc kubenswrapper[4693]: I1204 11:12:14.472503 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a68932-df18-486d-a483-84901d10c96e" path="/var/lib/kubelet/pods/81a68932-df18-486d-a483-84901d10c96e/volumes" Dec 04 11:12:21 crc kubenswrapper[4693]: I1204 11:12:21.462242 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:12:21 crc kubenswrapper[4693]: E1204 11:12:21.463406 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:12:27 crc kubenswrapper[4693]: I1204 11:12:27.633071 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dcd5dc8-znmkm_c8aed54b-9500-4b6d-a966-64fb3cff7b45/barbican-api/0.log" Dec 04 11:12:27 crc kubenswrapper[4693]: I1204 11:12:27.811655 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dcd5dc8-znmkm_c8aed54b-9500-4b6d-a966-64fb3cff7b45/barbican-api-log/0.log" Dec 04 11:12:27 crc kubenswrapper[4693]: I1204 11:12:27.850625 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76bdf94f96-jnvk8_33e00b6b-bd3b-4198-8333-1515f919cbfc/barbican-keystone-listener/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.038701 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d7c5fcd89-hgn5p_ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340/barbican-worker/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.123516 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d7c5fcd89-hgn5p_ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340/barbican-worker-log/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 
11:12:28.427157 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv_bd60e9f3-ac52-4a2b-9e3b-80720e7634ab/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.534323 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76bdf94f96-jnvk8_33e00b6b-bd3b-4198-8333-1515f919cbfc/barbican-keystone-listener-log/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.636702 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/ceilometer-central-agent/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.663621 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/ceilometer-notification-agent/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.701098 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/proxy-httpd/0.log" Dec 04 11:12:28 crc kubenswrapper[4693]: I1204 11:12:28.758446 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/sg-core/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.071416 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_5f0438dc-0fdb-48e2-a807-3292d8bb3fed/ceph/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.290115 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bfcdc882-9b7b-4e42-877e-6e8be8597470/cinder-api/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.314246 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bfcdc882-9b7b-4e42-877e-6e8be8597470/cinder-api-log/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.703153 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_ca5c459b-21a7-4799-a516-2a270de6e246/probe/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.767708 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_487df7df-e43a-48a6-8350-6b9804d13e39/cinder-scheduler/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.933792 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_ca5c459b-21a7-4799-a516-2a270de6e246/cinder-backup/0.log" Dec 04 11:12:29 crc kubenswrapper[4693]: I1204 11:12:29.987247 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_487df7df-e43a-48a6-8350-6b9804d13e39/probe/0.log" Dec 04 11:12:30 crc kubenswrapper[4693]: I1204 11:12:30.247785 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_711f27ff-01df-4851-bc37-a7115b5fa624/probe/0.log" Dec 04 11:12:30 crc kubenswrapper[4693]: I1204 11:12:30.277876 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-994x5_5aeee95a-198f-47ed-859b-0f710da9768c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:30 crc kubenswrapper[4693]: I1204 11:12:30.537163 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2p64q_b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:30 crc kubenswrapper[4693]: I1204 11:12:30.764490 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/init/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.000481 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/init/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.136379 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/dnsmasq-dns/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.274662 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz_710e62b8-160d-49f9-8bdb-418a0ee9f379/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.408368 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56bd6fe8-e97b-4c07-a204-ee44c09401b7/glance-httpd/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.537237 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56bd6fe8-e97b-4c07-a204-ee44c09401b7/glance-log/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.658523 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e1d7ce8c-35a9-406c-9b7d-10e4976bb156/glance-log/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.661507 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e1d7ce8c-35a9-406c-9b7d-10e4976bb156/glance-httpd/0.log" Dec 04 11:12:31 crc kubenswrapper[4693]: I1204 11:12:31.998803 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l_8ba7c573-448f-438c-9999-ffe4e8c28f52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.039818 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f8cd9d6cb-vf5bx_ca2592b0-5dfd-4d15-996c-2340af86bd26/horizon/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.286229 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cld2n_c8c97263-d34a-4231-9f52-5f0aae7163f2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.566835 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414101-snzll_e7a046b6-d7ff-46b3-a107-58e6e843bfff/keystone-cron/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.587317 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_711f27ff-01df-4851-bc37-a7115b5fa624/cinder-volume/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.737668 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f8cd9d6cb-vf5bx_ca2592b0-5dfd-4d15-996c-2340af86bd26/horizon-log/0.log" Dec 04 11:12:32 crc kubenswrapper[4693]: I1204 11:12:32.793965 4693 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_kube-state-metrics-0_4f9ccd88-ae2a-4026-b492-a09d29799c89/kube-state-metrics/0.log" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.054466 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p_a943a73c-465d-4a30-be17-967c79007a91/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.463214 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:12:33 crc kubenswrapper[4693]: E1204 11:12:33.463596 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.512460 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b6dc0f8-064a-4748-b69a-11713fe55088/manila-api/0.log" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.667783 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5f40f194-33e3-4723-817f-981394e545b9/probe/0.log" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.924302 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5f40f194-33e3-4723-817f-981394e545b9/manila-scheduler/0.log" Dec 04 11:12:33 crc kubenswrapper[4693]: I1204 11:12:33.944756 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_33b8b8b4-d56e-4c4c-9e87-95d334534e74/probe/0.log" Dec 04 11:12:34 crc kubenswrapper[4693]: I1204 11:12:34.207852 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b6dc0f8-064a-4748-b69a-11713fe55088/manila-api-log/0.log" Dec 04 11:12:34 crc kubenswrapper[4693]: I1204 11:12:34.253187 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_33b8b8b4-d56e-4c4c-9e87-95d334534e74/manila-share/0.log" Dec 04 11:12:34 crc kubenswrapper[4693]: I1204 11:12:34.866148 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s_cbd12578-e1a3-41b0-95be-2162e189daae/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:35 crc kubenswrapper[4693]: I1204 11:12:35.141693 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77d49c9649-fpwft_c2c81aab-5f08-429a-941e-9890ef46273e/neutron-httpd/0.log" Dec 04 11:12:35 crc kubenswrapper[4693]: I1204 11:12:35.729151 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77d49c9649-fpwft_c2c81aab-5f08-429a-941e-9890ef46273e/neutron-api/0.log" Dec 04 11:12:36 crc kubenswrapper[4693]: I1204 11:12:36.696470 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2ea5071f-1037-494c-b12f-ebddb5deb122/nova-cell0-conductor-conductor/0.log" Dec 04 11:12:36 crc kubenswrapper[4693]: I1204 11:12:36.774754 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f774bdc67-hjxts_fc3b6747-ed65-46ef-8034-e35edf80ac90/keystone-api/0.log" Dec 04 11:12:37 crc 
kubenswrapper[4693]: I1204 11:12:37.242812 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7cece472-f359-4ce8-b1f8-17ca920f4b3d/nova-cell1-conductor-conductor/0.log" Dec 04 11:12:37 crc kubenswrapper[4693]: I1204 11:12:37.688987 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bba2b0cd-4556-4a03-a111-d73471571173/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 11:12:37 crc kubenswrapper[4693]: I1204 11:12:37.917019 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d658460b-438a-46f0-88e1-136741999c81/nova-api-log/0.log" Dec 04 11:12:37 crc kubenswrapper[4693]: I1204 11:12:37.999410 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jj8z6_50c8246f-670e-4056-9c35-19e8042a96bf/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.254921 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc70aae2-116d-4528-8a5a-efab89d7e53b/nova-metadata-log/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.744440 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/mysql-bootstrap/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.756826 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d658460b-438a-46f0-88e1-136741999c81/nova-api-api/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.973267 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/galera/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.978380 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/mysql-bootstrap/0.log" Dec 04 11:12:38 crc kubenswrapper[4693]: I1204 11:12:38.980433 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ca70cccf-b92e-4997-9ca9-1375a2cceca1/nova-scheduler-scheduler/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.228562 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/mysql-bootstrap/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.396167 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/mysql-bootstrap/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.509074 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/galera/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.655740 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b2fefcbd-5f7f-4544-8f03-49adbe23a11b/openstackclient/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.780951 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-clqww_e8b66ffe-c672-438c-ab15-a4a44563152d/openstack-network-exporter/0.log" Dec 04 11:12:39 crc kubenswrapper[4693]: I1204 11:12:39.996297 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server-init/0.log" Dec 04 
11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.231715 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server-init/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.341390 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovs-vswitchd/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.360032 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.410140 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc70aae2-116d-4528-8a5a-efab89d7e53b/nova-metadata-metadata/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.586633 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zczb4_ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a/ovn-controller/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.723051 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wkvzh_ab7d9721-bbc8-489e-96de-98ce148725de/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.784055 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eec5c741-f1c6-424f-b3e1-4f5219fa0bf0/openstack-network-exporter/0.log" Dec 04 11:12:40 crc kubenswrapper[4693]: I1204 11:12:40.870257 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eec5c741-f1c6-424f-b3e1-4f5219fa0bf0/ovn-northd/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.024318 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_14639c36-341c-4f90-980b-b9fffce3c8f8/openstack-network-exporter/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.101563 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_14639c36-341c-4f90-980b-b9fffce3c8f8/ovsdbserver-nb/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.226553 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9999b422-d127-4990-8091-9446e589839a/openstack-network-exporter/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.260734 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9999b422-d127-4990-8091-9446e589839a/ovsdbserver-sb/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.662161 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/setup-container/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.799790 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-596dc75986-wjgrk_7e6c8844-46bb-47e2-99d2-a9da861757e7/placement-api/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.913972 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/setup-container/0.log" Dec 04 11:12:41 crc kubenswrapper[4693]: I1204 11:12:41.961930 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/rabbitmq/0.log" Dec 04 11:12:42 
crc kubenswrapper[4693]: I1204 11:12:42.012145 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-596dc75986-wjgrk_7e6c8844-46bb-47e2-99d2-a9da861757e7/placement-log/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.094822 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/setup-container/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.379926 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5_db089c88-39e5-4e8f-93b8-b02a59f50b93/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.412414 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/setup-container/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.424630 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/rabbitmq/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.697691 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5_21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:42 crc kubenswrapper[4693]: I1204 11:12:42.701062 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ksdq9_80be6b5e-e208-4c31-a663-4a01f460ea18/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.090713 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-tkkcn_ff072e20-bb88-4bc8-8e07-a12912774161/ssh-known-hosts-edpm-deployment/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.187039 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5wn5w_a57888c7-06f9-478c-9e80-3c028cabcb28/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.407885 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57857fb86f-8m84s_3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a/proxy-server/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.408686 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xz7js_39170f53-93c9-49fd-8dba-42d325269e74/swift-ring-rebalance/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.414621 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57857fb86f-8m84s_3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a/proxy-httpd/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.773382 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-reaper/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.807617 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-auditor/0.log" Dec 04 11:12:43 crc kubenswrapper[4693]: I1204 11:12:43.958715 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-replicator/0.log" Dec 04 11:12:44 crc 
kubenswrapper[4693]: I1204 11:12:44.019111 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-auditor/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.042686 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-server/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.106469 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-replicator/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.140183 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-server/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.252948 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-updater/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.330965 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-auditor/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.376289 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-expirer/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.406009 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-replicator/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.481120 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-server/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.581626 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-updater/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.596810 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/rsync/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.666514 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/swift-recon-cron/0.log" Dec 04 11:12:44 crc kubenswrapper[4693]: I1204 11:12:44.870830 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5_ba3e6d3f-0285-4742-80ba-4c15da05164c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:45 crc kubenswrapper[4693]: I1204 11:12:45.047861 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_da1a36ac-5a8b-475d-8434-eb43b0f8a656/tempest-tests-tempest-tests-runner/0.log" Dec 04 11:12:45 crc kubenswrapper[4693]: I1204 11:12:45.119355 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0e4ff807-907a-4310-97c4-7e60e55dcaca/test-operator-logs-container/0.log" Dec 04 11:12:45 crc kubenswrapper[4693]: I1204 11:12:45.283958 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn_bbf583ab-d797-4781-a13e-d4d493483d3e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:12:47 crc kubenswrapper[4693]: I1204 11:12:47.461916 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:12:47 crc kubenswrapper[4693]: E1204 11:12:47.462686 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:12:58 crc kubenswrapper[4693]: I1204 11:12:58.518467 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f304446b-e129-40a5-bc56-a79d0b973f0a/memcached/0.log" Dec 04 11:13:00 crc kubenswrapper[4693]: I1204 11:13:00.461859 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:13:00 crc kubenswrapper[4693]: E1204 11:13:00.462449 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:13:10 crc kubenswrapper[4693]: I1204 11:13:10.898021 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.084452 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.087738 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.123358 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.294869 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/extract/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.313801 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.317158 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.507726 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vtgdw_5aa92828-abd4-4f89-9621-5e9830101fca/kube-rbac-proxy/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.566303 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-5j4cs_f3a27983-d919-48fb-a227-f6a45efef985/kube-rbac-proxy/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.605853 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vtgdw_5aa92828-abd4-4f89-9621-5e9830101fca/manager/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.722075 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-5j4cs_f3a27983-d919-48fb-a227-f6a45efef985/manager/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.748000 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q8crl_2bb25289-630f-46c3-96f0-b5ea8177f5d8/kube-rbac-proxy/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.795869 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q8crl_2bb25289-630f-46c3-96f0-b5ea8177f5d8/manager/0.log" Dec 04 11:13:11 crc kubenswrapper[4693]: I1204 11:13:11.918490 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-tnzrv_7fb21378-fa3f-41a2-a6da-80831acec23c/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.050597 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-tnzrv_7fb21378-fa3f-41a2-a6da-80831acec23c/manager/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.123657 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-jgmqz_4ba18ef1-50c1-48d0-9d2e-3c83c65913ab/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.124402 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-jgmqz_4ba18ef1-50c1-48d0-9d2e-3c83c65913ab/manager/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.259106 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4gqg9_79466fca-aa64-407e-9488-d89e43d4bed9/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.331136 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4gqg9_79466fca-aa64-407e-9488-d89e43d4bed9/manager/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.386454 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-46wh4_b7bce599-dd9d-43c5-b5a9-53a081b6f183/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.541611 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qcc4g_37983465-c081-4645-9a0b-47431d284dbe/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.622925 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qcc4g_37983465-c081-4645-9a0b-47431d284dbe/manager/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.635821 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-46wh4_b7bce599-dd9d-43c5-b5a9-53a081b6f183/manager/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.769063 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gtwgq_65a2270f-58bd-486b-9be3-c85fee980070/kube-rbac-proxy/0.log" Dec 04 11:13:12 crc kubenswrapper[4693]: I1204 11:13:12.873243 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gtwgq_65a2270f-58bd-486b-9be3-c85fee980070/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.003432 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4mp89_1de6adcf-e847-4a10-af8c-683f83c32551/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.049103 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4mp89_1de6adcf-e847-4a10-af8c-683f83c32551/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.108999 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-np6kl_ae731a83-bab7-4843-b413-e8b03a3ca1c3/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.228648 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-np6kl_ae731a83-bab7-4843-b413-e8b03a3ca1c3/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.338035 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_92ac4c28-9d59-4955-b5cf-ae45e97fdeed/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.379145 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_92ac4c28-9d59-4955-b5cf-ae45e97fdeed/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.451713 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-klr7c_fc97bab3-20bf-4931-868d-a20ad433cc81/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.676369 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zg6gn_9b4ce9a3-bc13-4726-af72-c0f4c619efec/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.676952 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zg6gn_9b4ce9a3-bc13-4726-af72-c0f4c619efec/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 
11:13:13.724420 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-klr7c_fc97bab3-20bf-4931-868d-a20ad433cc81/manager/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.887896 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7_287b0c68-a203-4af6-b654-2eb97b004cdc/kube-rbac-proxy/0.log" Dec 04 11:13:13 crc kubenswrapper[4693]: I1204 11:13:13.895495 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7_287b0c68-a203-4af6-b654-2eb97b004cdc/manager/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.216493 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959575f68-6dnpv_4c440aca-49fc-4f5b-8890-d2b8c021febf/operator/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.321643 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-shhvj_f908e4a3-f7a8-4165-b9b1-8b25f43727e1/registry-server/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.514438 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-cxrdm_b352e856-6946-41ed-8d06-46b1ab00185e/kube-rbac-proxy/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.601747 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-cxrdm_b352e856-6946-41ed-8d06-46b1ab00185e/manager/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.746815 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4zbtc_75a5a37b-eb32-4654-85f7-1c7b9de1c247/manager/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.752893 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4zbtc_75a5a37b-eb32-4654-85f7-1c7b9de1c247/kube-rbac-proxy/0.log" Dec 04 11:13:14 crc kubenswrapper[4693]: I1204 11:13:14.994129 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xzz9b_4ffbc9ab-625c-467a-b3cb-017b4167d8a1/operator/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.048530 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gfvq4_23baa4a2-ca26-41a5-968c-f642ca80d1fa/kube-rbac-proxy/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.112542 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gfvq4_23baa4a2-ca26-41a5-968c-f642ca80d1fa/manager/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.293508 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zmpkb_a37bfc80-1ecc-4547-8fbe-be223b9a5cc2/kube-rbac-proxy/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.365426 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zmpkb_a37bfc80-1ecc-4547-8fbe-be223b9a5cc2/manager/0.log" Dec 04 11:13:15 crc 
kubenswrapper[4693]: I1204 11:13:15.384652 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5bf9d46bf4-jn6kv_9b4532d5-fce3-43a3-b72c-c0752eae7945/manager/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.463004 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:13:15 crc kubenswrapper[4693]: E1204 11:13:15.463408 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.479902 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-vfrzv_b2d582c6-b444-4591-93c3-7681714732bc/kube-rbac-proxy/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.525156 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-vfrzv_b2d582c6-b444-4591-93c3-7681714732bc/manager/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.565540 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzpj7_383c7650-d095-4996-88c6-06d999b1973b/kube-rbac-proxy/0.log" Dec 04 11:13:15 crc kubenswrapper[4693]: I1204 11:13:15.620185 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzpj7_383c7650-d095-4996-88c6-06d999b1973b/manager/0.log" Dec 04 11:13:29 crc kubenswrapper[4693]: I1204 11:13:29.461621 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:13:29 crc kubenswrapper[4693]: E1204 11:13:29.462718 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:13:29 crc kubenswrapper[4693]: I1204 11:13:29.671475 4693 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zctpg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 11:13:29 crc kubenswrapper[4693]: I1204 11:13:29.671542 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zctpg" podUID="1c46aadf-ba22-4bdc-b76a-8b9ad8880368" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 11:13:33 crc kubenswrapper[4693]: I1204 11:13:33.380749 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n9k5f_3e92c71e-1bcc-455f-a270-1dd051662af6/control-plane-machine-set-operator/0.log" Dec 04 11:13:33 crc kubenswrapper[4693]: I1204 11:13:33.527929 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gh7dl_46a329a4-a450-4e39-bcbe-c7dcba1e6939/kube-rbac-proxy/0.log" Dec 04 11:13:35 crc kubenswrapper[4693]: I1204 11:13:35.556584 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gh7dl_46a329a4-a450-4e39-bcbe-c7dcba1e6939/machine-api-operator/0.log" Dec 04 11:13:41 crc kubenswrapper[4693]: I1204 11:13:41.461598 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:13:41 crc kubenswrapper[4693]: E1204 11:13:41.462438 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:13:45 crc kubenswrapper[4693]: I1204 11:13:45.919524 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qc5kn_e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce/cert-manager-controller/0.log" Dec 04 11:13:46 crc kubenswrapper[4693]: I1204 11:13:46.083304 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qn5sx_5f2c58ea-f0fb-4460-9794-64d3182b3b5f/cert-manager-cainjector/0.log" Dec 04 11:13:46 crc kubenswrapper[4693]: I1204 11:13:46.142596 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9zv9m_df9e8e44-fcb6-48e4-abc7-cb16efbb64bd/cert-manager-webhook/0.log" Dec 04 11:13:53 crc kubenswrapper[4693]: I1204 11:13:53.462103 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:13:53 crc kubenswrapper[4693]: E1204 11:13:53.462877 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:13:58 crc kubenswrapper[4693]: I1204 11:13:58.587026 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-6tjzs_04395eb3-d17a-45e1-8c76-5cef70217095/nmstate-console-plugin/0.log" Dec 04 11:13:58 crc kubenswrapper[4693]: I1204 11:13:58.779940 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-897g8_3ce4524c-acc8-42ff-b575-712649f91f33/nmstate-handler/0.log" Dec 04 11:13:58 crc kubenswrapper[4693]: I1204 11:13:58.878131 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qvq5w_5054f7b0-a486-497e-9514-4a1387e7f815/kube-rbac-proxy/0.log" Dec 04 11:13:59 crc kubenswrapper[4693]: I1204 11:13:59.005107 4693 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qvq5w_5054f7b0-a486-497e-9514-4a1387e7f815/nmstate-metrics/0.log" Dec 04 11:13:59 crc kubenswrapper[4693]: I1204 11:13:59.035009 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-6q6qg_92bcc38e-785b-41c0-9bf0-60db671cf71c/nmstate-operator/0.log" Dec 04 11:13:59 crc kubenswrapper[4693]: I1204 11:13:59.194605 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-w2kzc_a3bf9b17-2a64-4697-b916-16b8c14f4bff/nmstate-webhook/0.log" Dec 04 11:14:06 crc kubenswrapper[4693]: I1204 11:14:06.461140 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:14:06 crc kubenswrapper[4693]: E1204 11:14:06.461816 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.551387 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:12 crc kubenswrapper[4693]: E1204 11:14:12.552302 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a68932-df18-486d-a483-84901d10c96e" containerName="container-00" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.552316 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a68932-df18-486d-a483-84901d10c96e" containerName="container-00" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.552529 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a68932-df18-486d-a483-84901d10c96e" containerName="container-00" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.558844 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.568443 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.641947 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.642153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.642255 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjh7\" (UniqueName: \"kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.744314 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.744457 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjh7\" (UniqueName: \"kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.744516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.744752 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.744993 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.763988 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pfjh7\" (UniqueName: \"kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7\") pod \"redhat-operators-wpz8c\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:12 crc kubenswrapper[4693]: I1204 11:14:12.896133 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.475654 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.557433 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.582713 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.582839 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.665774 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d686s\" (UniqueName: \"kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.665853 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.665971 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.767631 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.767772 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.767866 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d686s\" (UniqueName: \"kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " 
pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.768172 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.768259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.794629 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d686s\" (UniqueName: \"kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s\") pod \"community-operators-bbbjp\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:13 crc kubenswrapper[4693]: I1204 11:14:13.952559 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.450633 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-tvgv5_05ad9856-cd9f-4317-8c24-ebfa61baa56b/kube-rbac-proxy/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: W1204 11:14:14.505922 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a545b9_f2e5_4114_b554_7259dbe4d0a2.slice/crio-079d51524d235ef363d1b79717d4d28bd67aa74b0ad4328e48d1f2d00613adf5 WatchSource:0}: Error finding container 079d51524d235ef363d1b79717d4d28bd67aa74b0ad4328e48d1f2d00613adf5: Status 404 returned error can't find the container with id 079d51524d235ef363d1b79717d4d28bd67aa74b0ad4328e48d1f2d00613adf5 Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.518427 4693 generic.go:334] "Generic (PLEG): container finished" podID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerID="fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450" exitCode=0 Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.518479 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerDied","Data":"fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450"} Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.518524 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerStarted","Data":"1eaffe84af31b0f986fc3e1529ea5a395d66b4876b9db1fd202c4db1bd29e576"} Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.522630 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.657855 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-tvgv5_05ad9856-cd9f-4317-8c24-ebfa61baa56b/controller/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.712110 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.859430 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.884256 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.908517 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:14:14 crc kubenswrapper[4693]: I1204 11:14:14.927359 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.153701 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.161025 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.161066 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.176601 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.327945 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.361477 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.361610 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.363313 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/controller/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.534554 4693 generic.go:334] "Generic (PLEG): container finished" podID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerID="3eb54d024b7b9b2d1d665c29ba56e71c3e67b21b6d6baf2e5cf3a8ff082d75bf" exitCode=0 Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.535113 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerDied","Data":"3eb54d024b7b9b2d1d665c29ba56e71c3e67b21b6d6baf2e5cf3a8ff082d75bf"} Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.535199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" 
event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerStarted","Data":"079d51524d235ef363d1b79717d4d28bd67aa74b0ad4328e48d1f2d00613adf5"} Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.581737 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/kube-rbac-proxy-frr/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.619190 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/frr-metrics/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.620419 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/kube-rbac-proxy/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.799978 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/reloader/0.log" Dec 04 11:14:15 crc kubenswrapper[4693]: I1204 11:14:15.867639 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-qdqd6_f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a/frr-k8s-webhook-server/0.log" Dec 04 11:14:16 crc kubenswrapper[4693]: I1204 11:14:16.076292 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cfc67b4f5-6t7cx_990d09c4-a9e4-4233-ae12-3910f2937270/manager/0.log" Dec 04 11:14:16 crc kubenswrapper[4693]: I1204 11:14:16.282075 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5f644b44db-65csk_d7810505-b15b-4970-8cf0-f7217394a1ca/webhook-server/0.log" Dec 04 11:14:16 crc kubenswrapper[4693]: I1204 11:14:16.365056 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-f7pnf_8cb6de38-296b-415b-8f7c-aa037586a5db/kube-rbac-proxy/0.log" Dec 04 11:14:16 crc kubenswrapper[4693]: I1204 11:14:16.556987 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerStarted","Data":"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00"} Dec 04 11:14:17 crc kubenswrapper[4693]: I1204 11:14:17.270619 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-f7pnf_8cb6de38-296b-415b-8f7c-aa037586a5db/speaker/0.log" Dec 04 11:14:17 crc kubenswrapper[4693]: I1204 11:14:17.462696 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:14:17 crc kubenswrapper[4693]: E1204 11:14:17.467418 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:14:17 crc kubenswrapper[4693]: I1204 11:14:17.516573 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/frr/0.log" Dec 04 11:14:17 crc kubenswrapper[4693]: I1204 11:14:17.566289 4693 generic.go:334] "Generic (PLEG): container finished" podID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" 
containerID="a04d1ed147f6119a313e6b362d8f75e71d144632f3a0ae37aba272cf49715774" exitCode=0 Dec 04 11:14:17 crc kubenswrapper[4693]: I1204 11:14:17.566398 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerDied","Data":"a04d1ed147f6119a313e6b362d8f75e71d144632f3a0ae37aba272cf49715774"} Dec 04 11:14:20 crc kubenswrapper[4693]: I1204 11:14:20.611971 4693 generic.go:334] "Generic (PLEG): container finished" podID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerID="eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00" exitCode=0 Dec 04 11:14:20 crc kubenswrapper[4693]: I1204 11:14:20.612161 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerDied","Data":"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00"} Dec 04 11:14:21 crc kubenswrapper[4693]: I1204 11:14:21.625206 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerStarted","Data":"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc"} Dec 04 11:14:21 crc kubenswrapper[4693]: I1204 11:14:21.628127 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerStarted","Data":"c2fb659defbb83e2e081ee6b17ddcfcfed8e1260db7007d9d2e668e4dbb1e890"} Dec 04 11:14:21 crc kubenswrapper[4693]: I1204 11:14:21.656877 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wpz8c" podStartSLOduration=2.812748205 podStartE2EDuration="9.656859029s" podCreationTimestamp="2025-12-04 11:14:12 +0000 UTC" firstStartedPulling="2025-12-04 11:14:14.5197256 +0000 UTC m=+5500.417319353" lastFinishedPulling="2025-12-04 11:14:21.363836424 +0000 UTC m=+5507.261430177" observedRunningTime="2025-12-04 11:14:21.647093678 +0000 UTC m=+5507.544687431" watchObservedRunningTime="2025-12-04 11:14:21.656859029 +0000 UTC m=+5507.554452782" Dec 04 11:14:21 crc kubenswrapper[4693]: I1204 11:14:21.668772 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbbjp" podStartSLOduration=2.943533162 podStartE2EDuration="8.668751827s" podCreationTimestamp="2025-12-04 11:14:13 +0000 UTC" firstStartedPulling="2025-12-04 11:14:15.537088827 +0000 UTC m=+5501.434682580" lastFinishedPulling="2025-12-04 11:14:21.262307492 +0000 UTC m=+5507.159901245" observedRunningTime="2025-12-04 11:14:21.666474796 +0000 UTC m=+5507.564068549" watchObservedRunningTime="2025-12-04 11:14:21.668751827 +0000 UTC m=+5507.566345580" Dec 04 11:14:22 crc kubenswrapper[4693]: I1204 11:14:22.897512 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:22 crc kubenswrapper[4693]: I1204 11:14:22.897868 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:23 crc kubenswrapper[4693]: I1204 11:14:23.952063 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpz8c" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="registry-server" probeResult="failure" output=< Dec 04 
11:14:23 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Dec 04 11:14:23 crc kubenswrapper[4693]: > Dec 04 11:14:23 crc kubenswrapper[4693]: I1204 11:14:23.953372 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:23 crc kubenswrapper[4693]: I1204 11:14:23.953411 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:23 crc kubenswrapper[4693]: I1204 11:14:23.998448 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.550998 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.556090 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.564421 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.644823 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.645118 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.645412 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhp8k\" (UniqueName: \"kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.746903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.747236 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.747283 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhp8k\" (UniqueName: \"kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k\") pod 
\"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.747714 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.747738 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.774799 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhp8k\" (UniqueName: \"kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k\") pod \"certified-operators-xhf5n\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:27 crc kubenswrapper[4693]: I1204 11:14:27.883776 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.168415 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.170666 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.178000 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.258635 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.258699 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqh6\" (UniqueName: \"kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.258900 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.360586 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content\") pod 
\"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.360863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.360900 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqh6\" (UniqueName: \"kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.361817 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.362023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.390145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqh6\" (UniqueName: \"kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6\") pod \"redhat-marketplace-sf47l\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.438138 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:28 crc kubenswrapper[4693]: W1204 11:14:28.451893 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f904038_2816_4047_a48a_2ad5bca195d1.slice/crio-8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d WatchSource:0}: Error finding container 8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d: Status 404 returned error can't find the container with id 8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.513270 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.699122 4693 generic.go:334] "Generic (PLEG): container finished" podID="8f904038-2816-4047-a48a-2ad5bca195d1" containerID="45cbe399558bd8b0bd7cd1936d37b75ee7b99fcf38ce72dcb98ea2f7d4f0ca91" exitCode=0 Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.699304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerDied","Data":"45cbe399558bd8b0bd7cd1936d37b75ee7b99fcf38ce72dcb98ea2f7d4f0ca91"} Dec 04 11:14:28 crc kubenswrapper[4693]: I1204 11:14:28.699460 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerStarted","Data":"8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d"} Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.038916 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.722991 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerStarted","Data":"dad58f727da88c4218f9496bc4af0697dae2cf2f128eb8419d17f8d56f5fb73a"} Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.724969 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerID="abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd" exitCode=0 Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.725027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerDied","Data":"abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd"} Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.725057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerStarted","Data":"0e40e98454b490349edad0dce2b3804aef6ba42568c63192e69570cec1befc5b"} Dec 04 11:14:29 crc kubenswrapper[4693]: I1204 11:14:29.870693 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.095695 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.115533 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.133677 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.380564 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.396713 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.774591 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.774617 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.774848 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/extract/0.log" Dec 04 11:14:30 crc kubenswrapper[4693]: I1204 11:14:30.775595 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.103010 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.112281 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.145833 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.146011 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/extract/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.343269 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.540874 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.543754 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.562006 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.741860 4693 generic.go:334] 
"Generic (PLEG): container finished" podID="8f904038-2816-4047-a48a-2ad5bca195d1" containerID="dad58f727da88c4218f9496bc4af0697dae2cf2f128eb8419d17f8d56f5fb73a" exitCode=0 Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.741902 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerDied","Data":"dad58f727da88c4218f9496bc4af0697dae2cf2f128eb8419d17f8d56f5fb73a"} Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.831267 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:14:31 crc kubenswrapper[4693]: I1204 11:14:31.840737 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.040811 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-utilities/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.334506 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-utilities/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.404849 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-content/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.435560 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-content/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.461083 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.554283 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/registry-server/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.590203 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-utilities/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.618603 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/extract-content/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.754204 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerID="3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd" exitCode=0 Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.754283 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerDied","Data":"3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd"} Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.760305 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" 
event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerStarted","Data":"0149853cbc8adf0ea71815603b788829820ea1e08119d1d6b4c95f6a91ee8974"} Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.763666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60"} Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.767199 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xhf5n_8f904038-2816-4047-a48a-2ad5bca195d1/registry-server/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.806606 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xhf5n" podStartSLOduration=2.315192335 podStartE2EDuration="5.806590549s" podCreationTimestamp="2025-12-04 11:14:27 +0000 UTC" firstStartedPulling="2025-12-04 11:14:28.701193209 +0000 UTC m=+5514.598786962" lastFinishedPulling="2025-12-04 11:14:32.192591423 +0000 UTC m=+5518.090185176" observedRunningTime="2025-12-04 11:14:32.80137506 +0000 UTC m=+5518.698968823" watchObservedRunningTime="2025-12-04 11:14:32.806590549 +0000 UTC m=+5518.704184302" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.882224 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-utilities/0.log" Dec 04 11:14:32 crc kubenswrapper[4693]: I1204 11:14:32.963057 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.022933 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.123107 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-content/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.209227 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-utilities/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.222738 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-content/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.440918 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-utilities/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.441118 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/extract-content/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.441383 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbbjp_64a545b9-f2e5-4114-b554-7259dbe4d0a2/registry-server/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.623254 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.885042 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.885446 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:14:33 crc kubenswrapper[4693]: I1204 11:14:33.899541 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.021343 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.141582 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.182012 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.234878 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26frc_57e8fd24-01fe-42d0-9bd6-6066003c724b/marketplace-operator/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.499039 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.714437 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.781813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.795564 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerStarted","Data":"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c"} Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.802444 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:14:34 crc kubenswrapper[4693]: I1204 11:14:34.834615 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sf47l" podStartSLOduration=2.809305934 podStartE2EDuration="6.834501652s" podCreationTimestamp="2025-12-04 11:14:28 +0000 UTC" firstStartedPulling="2025-12-04 11:14:29.729942251 +0000 UTC m=+5515.627536004" lastFinishedPulling="2025-12-04 11:14:33.755137959 +0000 UTC m=+5519.652731722" observedRunningTime="2025-12-04 11:14:34.814840017 +0000 UTC 
m=+5520.712433770" watchObservedRunningTime="2025-12-04 11:14:34.834501652 +0000 UTC m=+5520.732095405" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.145201 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.168781 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.316640 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/registry-server/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.430487 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/registry-server/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.430625 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-utilities/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.587726 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-content/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.595239 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-content/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.612831 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-utilities/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.840773 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-utilities/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.845577 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/extract-content/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.900412 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sf47l_e0803114-c7c2-4f69-bf87-981f0ae84aff/registry-server/0.log" Dec 04 11:14:35 crc kubenswrapper[4693]: I1204 11:14:35.916453 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.146267 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-content/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.174763 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-content/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.187855 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.374291 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-content/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.390866 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.458097 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wpz8c_48c2dd0e-da3a-40fa-b210-063c2c490fda/registry-server/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.490627 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.657086 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.674007 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.701830 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.860383 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:14:36 crc kubenswrapper[4693]: I1204 11:14:36.882230 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.143592 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.144288 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wpz8c" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="registry-server" containerID="cri-o://69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc" gracePeriod=2 Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.353032 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.353965 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bbbjp" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="registry-server" containerID="cri-o://c2fb659defbb83e2e081ee6b17ddcfcfed8e1260db7007d9d2e668e4dbb1e890" gracePeriod=2 Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.747183 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/registry-server/0.log" Dec 04 11:14:37 
crc kubenswrapper[4693]: I1204 11:14:37.763121 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.828542 4693 generic.go:334] "Generic (PLEG): container finished" podID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerID="69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc" exitCode=0 Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.828663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerDied","Data":"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc"} Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.828727 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpz8c" event={"ID":"48c2dd0e-da3a-40fa-b210-063c2c490fda","Type":"ContainerDied","Data":"1eaffe84af31b0f986fc3e1529ea5a395d66b4876b9db1fd202c4db1bd29e576"} Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.828750 4693 scope.go:117] "RemoveContainer" containerID="69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.830612 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpz8c" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.839841 4693 generic.go:334] "Generic (PLEG): container finished" podID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerID="c2fb659defbb83e2e081ee6b17ddcfcfed8e1260db7007d9d2e668e4dbb1e890" exitCode=0 Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.839895 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerDied","Data":"c2fb659defbb83e2e081ee6b17ddcfcfed8e1260db7007d9d2e668e4dbb1e890"} Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.875870 4693 scope.go:117] "RemoveContainer" containerID="eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.877209 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfjh7\" (UniqueName: \"kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7\") pod \"48c2dd0e-da3a-40fa-b210-063c2c490fda\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.877561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities\") pod \"48c2dd0e-da3a-40fa-b210-063c2c490fda\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.877645 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content\") pod \"48c2dd0e-da3a-40fa-b210-063c2c490fda\" (UID: \"48c2dd0e-da3a-40fa-b210-063c2c490fda\") " Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.878779 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities" (OuterVolumeSpecName: "utilities") pod 
"48c2dd0e-da3a-40fa-b210-063c2c490fda" (UID: "48c2dd0e-da3a-40fa-b210-063c2c490fda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.884472 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.884524 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.891658 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7" (OuterVolumeSpecName: "kube-api-access-pfjh7") pod "48c2dd0e-da3a-40fa-b210-063c2c490fda" (UID: "48c2dd0e-da3a-40fa-b210-063c2c490fda"). InnerVolumeSpecName "kube-api-access-pfjh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.927519 4693 scope.go:117] "RemoveContainer" containerID="fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.960387 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.971659 4693 scope.go:117] "RemoveContainer" containerID="69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc" Dec 04 11:14:37 crc kubenswrapper[4693]: E1204 11:14:37.972295 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc\": container with ID starting with 69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc not found: ID does not exist" containerID="69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.972352 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc"} err="failed to get container status \"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc\": rpc error: code = NotFound desc = could not find container \"69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc\": container with ID starting with 69b1d4d69a1d973eead9ea16ac3f8d338ecf640d5e7e15ac2390389d681446bc not found: ID does not exist" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.972388 4693 scope.go:117] "RemoveContainer" containerID="eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00" Dec 04 11:14:37 crc kubenswrapper[4693]: E1204 11:14:37.972685 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00\": container with ID starting with eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00 not found: ID does not exist" containerID="eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.972709 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00"} err="failed to get container status 
\"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00\": rpc error: code = NotFound desc = could not find container \"eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00\": container with ID starting with eebabc1807b4318eb43f338a7c605f3e7f608773f458e953793a6a9e2337fd00 not found: ID does not exist" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.972728 4693 scope.go:117] "RemoveContainer" containerID="fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450" Dec 04 11:14:37 crc kubenswrapper[4693]: E1204 11:14:37.973051 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450\": container with ID starting with fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450 not found: ID does not exist" containerID="fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.973076 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450"} err="failed to get container status \"fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450\": rpc error: code = NotFound desc = could not find container \"fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450\": container with ID starting with fe66ff248d57e9c38e3e81f6abd174820d973a906cbf3d22384e9656c28cb450 not found: ID does not exist" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.976893 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.982222 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfjh7\" (UniqueName: \"kubernetes.io/projected/48c2dd0e-da3a-40fa-b210-063c2c490fda-kube-api-access-pfjh7\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:37 crc kubenswrapper[4693]: I1204 11:14:37.982278 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.016563 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48c2dd0e-da3a-40fa-b210-063c2c490fda" (UID: "48c2dd0e-da3a-40fa-b210-063c2c490fda"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.083047 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d686s\" (UniqueName: \"kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s\") pod \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.083100 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content\") pod \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.083353 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities\") pod \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\" (UID: \"64a545b9-f2e5-4114-b554-7259dbe4d0a2\") " Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.083753 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c2dd0e-da3a-40fa-b210-063c2c490fda-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.084184 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities" (OuterVolumeSpecName: "utilities") pod "64a545b9-f2e5-4114-b554-7259dbe4d0a2" (UID: "64a545b9-f2e5-4114-b554-7259dbe4d0a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.086539 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s" (OuterVolumeSpecName: "kube-api-access-d686s") pod "64a545b9-f2e5-4114-b554-7259dbe4d0a2" (UID: "64a545b9-f2e5-4114-b554-7259dbe4d0a2"). InnerVolumeSpecName "kube-api-access-d686s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.142165 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64a545b9-f2e5-4114-b554-7259dbe4d0a2" (UID: "64a545b9-f2e5-4114-b554-7259dbe4d0a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.171114 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.182705 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wpz8c"] Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.185930 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.185977 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d686s\" (UniqueName: \"kubernetes.io/projected/64a545b9-f2e5-4114-b554-7259dbe4d0a2-kube-api-access-d686s\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.185989 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a545b9-f2e5-4114-b554-7259dbe4d0a2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.471437 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" path="/var/lib/kubelet/pods/48c2dd0e-da3a-40fa-b210-063c2c490fda/volumes" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.515349 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.515597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.575289 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.864072 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbbjp" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.864620 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbbjp" event={"ID":"64a545b9-f2e5-4114-b554-7259dbe4d0a2","Type":"ContainerDied","Data":"079d51524d235ef363d1b79717d4d28bd67aa74b0ad4328e48d1f2d00613adf5"} Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.864659 4693 scope.go:117] "RemoveContainer" containerID="c2fb659defbb83e2e081ee6b17ddcfcfed8e1260db7007d9d2e668e4dbb1e890" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.896096 4693 scope.go:117] "RemoveContainer" containerID="a04d1ed147f6119a313e6b362d8f75e71d144632f3a0ae37aba272cf49715774" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.898712 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.912948 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bbbjp"] Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.914924 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.929985 4693 scope.go:117] "RemoveContainer" containerID="3eb54d024b7b9b2d1d665c29ba56e71c3e67b21b6d6baf2e5cf3a8ff082d75bf" Dec 04 11:14:38 crc kubenswrapper[4693]: I1204 11:14:38.945617 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:39 crc kubenswrapper[4693]: I1204 11:14:39.544844 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:40 crc kubenswrapper[4693]: I1204 11:14:40.474949 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" path="/var/lib/kubelet/pods/64a545b9-f2e5-4114-b554-7259dbe4d0a2/volumes" Dec 04 11:14:40 crc kubenswrapper[4693]: I1204 11:14:40.883161 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sf47l" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="registry-server" containerID="cri-o://71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c" gracePeriod=2 Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.363855 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.453584 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content\") pod \"e0803114-c7c2-4f69-bf87-981f0ae84aff\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.453932 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlqh6\" (UniqueName: \"kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6\") pod \"e0803114-c7c2-4f69-bf87-981f0ae84aff\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.454008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities\") pod \"e0803114-c7c2-4f69-bf87-981f0ae84aff\" (UID: \"e0803114-c7c2-4f69-bf87-981f0ae84aff\") " Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.454898 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities" (OuterVolumeSpecName: "utilities") pod "e0803114-c7c2-4f69-bf87-981f0ae84aff" (UID: "e0803114-c7c2-4f69-bf87-981f0ae84aff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.459257 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6" (OuterVolumeSpecName: "kube-api-access-tlqh6") pod "e0803114-c7c2-4f69-bf87-981f0ae84aff" (UID: "e0803114-c7c2-4f69-bf87-981f0ae84aff"). InnerVolumeSpecName "kube-api-access-tlqh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.472844 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0803114-c7c2-4f69-bf87-981f0ae84aff" (UID: "e0803114-c7c2-4f69-bf87-981f0ae84aff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.556981 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlqh6\" (UniqueName: \"kubernetes.io/projected/e0803114-c7c2-4f69-bf87-981f0ae84aff-kube-api-access-tlqh6\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.557024 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.557033 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0803114-c7c2-4f69-bf87-981f0ae84aff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.895800 4693 generic.go:334] "Generic (PLEG): container finished" podID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerID="71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c" exitCode=0 Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.895853 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerDied","Data":"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c"} Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.895903 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf47l" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.895922 4693 scope.go:117] "RemoveContainer" containerID="71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.895903 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf47l" event={"ID":"e0803114-c7c2-4f69-bf87-981f0ae84aff","Type":"ContainerDied","Data":"0e40e98454b490349edad0dce2b3804aef6ba42568c63192e69570cec1befc5b"} Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.923064 4693 scope.go:117] "RemoveContainer" containerID="3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.948503 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.951202 4693 scope.go:117] "RemoveContainer" containerID="abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd" Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.965600 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf47l"] Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.974679 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:41 crc kubenswrapper[4693]: I1204 11:14:41.974944 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xhf5n" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="registry-server" containerID="cri-o://0149853cbc8adf0ea71815603b788829820ea1e08119d1d6b4c95f6a91ee8974" gracePeriod=2 Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.014983 4693 scope.go:117] "RemoveContainer" 
containerID="71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c" Dec 04 11:14:42 crc kubenswrapper[4693]: E1204 11:14:42.015721 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c\": container with ID starting with 71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c not found: ID does not exist" containerID="71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.015764 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c"} err="failed to get container status \"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c\": rpc error: code = NotFound desc = could not find container \"71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c\": container with ID starting with 71780c4797f7cdf970d4fd950d761fdf56dc23e4ae726eb7722b33be552a011c not found: ID does not exist" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.015791 4693 scope.go:117] "RemoveContainer" containerID="3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd" Dec 04 11:14:42 crc kubenswrapper[4693]: E1204 11:14:42.018030 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd\": container with ID starting with 3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd not found: ID does not exist" containerID="3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.018075 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd"} err="failed to get container status \"3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd\": rpc error: code = NotFound desc = could not find container \"3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd\": container with ID starting with 3f823bccab2c8c5a26c312b4a7d22cd541e5fddd3ada44aa8b4e06e77133f3fd not found: ID does not exist" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.018101 4693 scope.go:117] "RemoveContainer" containerID="abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd" Dec 04 11:14:42 crc kubenswrapper[4693]: E1204 11:14:42.019854 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd\": container with ID starting with abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd not found: ID does not exist" containerID="abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.019884 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd"} err="failed to get container status \"abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd\": rpc error: code = NotFound desc = could not find container \"abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd\": container with ID starting with 
abb6f2e482cb9f4da328223b2dc66622dc4f121c3af07200a0a99abcd548d0bd not found: ID does not exist" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.473151 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" path="/var/lib/kubelet/pods/e0803114-c7c2-4f69-bf87-981f0ae84aff/volumes" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.909058 4693 generic.go:334] "Generic (PLEG): container finished" podID="8f904038-2816-4047-a48a-2ad5bca195d1" containerID="0149853cbc8adf0ea71815603b788829820ea1e08119d1d6b4c95f6a91ee8974" exitCode=0 Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.909104 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerDied","Data":"0149853cbc8adf0ea71815603b788829820ea1e08119d1d6b4c95f6a91ee8974"} Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.909129 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhf5n" event={"ID":"8f904038-2816-4047-a48a-2ad5bca195d1","Type":"ContainerDied","Data":"8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d"} Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.909143 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e61a51a467257f933ce3c42a36266e01f3064b7068c60ccf983b2e2bda8c52d" Dec 04 11:14:42 crc kubenswrapper[4693]: I1204 11:14:42.944080 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.088143 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhp8k\" (UniqueName: \"kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k\") pod \"8f904038-2816-4047-a48a-2ad5bca195d1\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.088425 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content\") pod \"8f904038-2816-4047-a48a-2ad5bca195d1\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.088547 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities\") pod \"8f904038-2816-4047-a48a-2ad5bca195d1\" (UID: \"8f904038-2816-4047-a48a-2ad5bca195d1\") " Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.089773 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities" (OuterVolumeSpecName: "utilities") pod "8f904038-2816-4047-a48a-2ad5bca195d1" (UID: "8f904038-2816-4047-a48a-2ad5bca195d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.095109 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k" (OuterVolumeSpecName: "kube-api-access-rhp8k") pod "8f904038-2816-4047-a48a-2ad5bca195d1" (UID: "8f904038-2816-4047-a48a-2ad5bca195d1"). 
InnerVolumeSpecName "kube-api-access-rhp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.144307 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f904038-2816-4047-a48a-2ad5bca195d1" (UID: "8f904038-2816-4047-a48a-2ad5bca195d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.191026 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.191063 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f904038-2816-4047-a48a-2ad5bca195d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.191074 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhp8k\" (UniqueName: \"kubernetes.io/projected/8f904038-2816-4047-a48a-2ad5bca195d1-kube-api-access-rhp8k\") on node \"crc\" DevicePath \"\"" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.916750 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhf5n" Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.951621 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:43 crc kubenswrapper[4693]: I1204 11:14:43.962079 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xhf5n"] Dec 04 11:14:44 crc kubenswrapper[4693]: I1204 11:14:44.472092 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" path="/var/lib/kubelet/pods/8f904038-2816-4047-a48a-2ad5bca195d1/volumes" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.147137 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk"] Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148042 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148057 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148072 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148079 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148101 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148107 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="extract-content" Dec 04 11:15:00 crc 
kubenswrapper[4693]: E1204 11:15:00.148119 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148124 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148134 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148141 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148152 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148157 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148166 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148171 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148184 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148191 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148203 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148208 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148230 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148236 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148244 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148250 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="extract-content" Dec 04 11:15:00 crc kubenswrapper[4693]: E1204 11:15:00.148266 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148271 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" 
containerName="extract-utilities" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148466 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f904038-2816-4047-a48a-2ad5bca195d1" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148488 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a545b9-f2e5-4114-b554-7259dbe4d0a2" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148500 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0803114-c7c2-4f69-bf87-981f0ae84aff" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.148512 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c2dd0e-da3a-40fa-b210-063c2c490fda" containerName="registry-server" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.149124 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.154415 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.154511 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.168612 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk"] Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.223719 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.223781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fws58\" (UniqueName: \"kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.223843 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.325720 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.325908 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.325949 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fws58\" (UniqueName: \"kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.327112 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.338248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.345520 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fws58\" (UniqueName: \"kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58\") pod \"collect-profiles-29414115-4n6xk\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:00 crc kubenswrapper[4693]: I1204 11:15:00.524842 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:01 crc kubenswrapper[4693]: I1204 11:15:01.063384 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk"] Dec 04 11:15:02 crc kubenswrapper[4693]: I1204 11:15:02.124706 4693 generic.go:334] "Generic (PLEG): container finished" podID="4a856208-53e8-497e-a0f8-901f4977b416" containerID="49e7e213677adfe5b7240a42e334b39a79ccda26bc29eb14f18f4670c08c5e0a" exitCode=0 Dec 04 11:15:02 crc kubenswrapper[4693]: I1204 11:15:02.124790 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" event={"ID":"4a856208-53e8-497e-a0f8-901f4977b416","Type":"ContainerDied","Data":"49e7e213677adfe5b7240a42e334b39a79ccda26bc29eb14f18f4670c08c5e0a"} Dec 04 11:15:02 crc kubenswrapper[4693]: I1204 11:15:02.125024 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" event={"ID":"4a856208-53e8-497e-a0f8-901f4977b416","Type":"ContainerStarted","Data":"799590a5fc4db4de9ab0982c38fe2760009da0d2378ede7efda06033469a803a"} Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.542516 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.696999 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fws58\" (UniqueName: \"kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58\") pod \"4a856208-53e8-497e-a0f8-901f4977b416\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.697115 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume\") pod \"4a856208-53e8-497e-a0f8-901f4977b416\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.697228 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume\") pod \"4a856208-53e8-497e-a0f8-901f4977b416\" (UID: \"4a856208-53e8-497e-a0f8-901f4977b416\") " Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.697925 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume" (OuterVolumeSpecName: "config-volume") pod "4a856208-53e8-497e-a0f8-901f4977b416" (UID: "4a856208-53e8-497e-a0f8-901f4977b416"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.702476 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4a856208-53e8-497e-a0f8-901f4977b416" (UID: "4a856208-53e8-497e-a0f8-901f4977b416"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.717556 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58" (OuterVolumeSpecName: "kube-api-access-fws58") pod "4a856208-53e8-497e-a0f8-901f4977b416" (UID: "4a856208-53e8-497e-a0f8-901f4977b416"). InnerVolumeSpecName "kube-api-access-fws58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.799999 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fws58\" (UniqueName: \"kubernetes.io/projected/4a856208-53e8-497e-a0f8-901f4977b416-kube-api-access-fws58\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.800039 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a856208-53e8-497e-a0f8-901f4977b416-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:03 crc kubenswrapper[4693]: I1204 11:15:03.800051 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a856208-53e8-497e-a0f8-901f4977b416-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:15:04 crc kubenswrapper[4693]: I1204 11:15:04.146128 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" event={"ID":"4a856208-53e8-497e-a0f8-901f4977b416","Type":"ContainerDied","Data":"799590a5fc4db4de9ab0982c38fe2760009da0d2378ede7efda06033469a803a"} Dec 04 11:15:04 crc kubenswrapper[4693]: I1204 11:15:04.146181 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799590a5fc4db4de9ab0982c38fe2760009da0d2378ede7efda06033469a803a" Dec 04 11:15:04 crc kubenswrapper[4693]: I1204 11:15:04.146186 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414115-4n6xk" Dec 04 11:15:04 crc kubenswrapper[4693]: I1204 11:15:04.634873 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx"] Dec 04 11:15:04 crc kubenswrapper[4693]: I1204 11:15:04.644045 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-b79nx"] Dec 04 11:15:06 crc kubenswrapper[4693]: I1204 11:15:06.473021 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8658c582-4e18-4b63-ae78-81da959895b5" path="/var/lib/kubelet/pods/8658c582-4e18-4b63-ae78-81da959895b5/volumes" Dec 04 11:15:12 crc kubenswrapper[4693]: I1204 11:15:12.415565 4693 scope.go:117] "RemoveContainer" containerID="56b13829980dacea53c54829f0b24b8652b4da321c853ca0c347efbf0b3e7ee6" Dec 04 11:16:52 crc kubenswrapper[4693]: I1204 11:16:52.272641 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:16:52 crc kubenswrapper[4693]: I1204 11:16:52.273075 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:16:56 crc kubenswrapper[4693]: I1204 11:16:56.184716 4693 generic.go:334] "Generic (PLEG): container finished" podID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerID="cf93b52f8706c2d851e302cfcf7ceae09c281b25ecab84c9716b3a1e38848696" exitCode=0 Dec 04 11:16:56 crc kubenswrapper[4693]: I1204 11:16:56.184818 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zljj5/must-gather-8jmx2" event={"ID":"ef96b9a9-00d5-410f-a233-c4df68302d90","Type":"ContainerDied","Data":"cf93b52f8706c2d851e302cfcf7ceae09c281b25ecab84c9716b3a1e38848696"} Dec 04 11:16:56 crc kubenswrapper[4693]: I1204 11:16:56.186205 4693 scope.go:117] "RemoveContainer" containerID="cf93b52f8706c2d851e302cfcf7ceae09c281b25ecab84c9716b3a1e38848696" Dec 04 11:16:57 crc kubenswrapper[4693]: I1204 11:16:57.272742 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zljj5_must-gather-8jmx2_ef96b9a9-00d5-410f-a233-c4df68302d90/gather/0.log" Dec 04 11:17:06 crc kubenswrapper[4693]: I1204 11:17:06.658411 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zljj5/must-gather-8jmx2"] Dec 04 11:17:06 crc kubenswrapper[4693]: I1204 11:17:06.659242 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zljj5/must-gather-8jmx2" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="copy" containerID="cri-o://b91533a9e6f4e4172ac4363b1c60c0d495cafd811e2d7a19867d53a678edbbcb" gracePeriod=2 Dec 04 11:17:06 crc kubenswrapper[4693]: I1204 11:17:06.666280 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zljj5/must-gather-8jmx2"] Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.283879 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zljj5_must-gather-8jmx2_ef96b9a9-00d5-410f-a233-c4df68302d90/copy/0.log" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.284840 4693 generic.go:334] "Generic (PLEG): container finished" podID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerID="b91533a9e6f4e4172ac4363b1c60c0d495cafd811e2d7a19867d53a678edbbcb" exitCode=143 Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.649433 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zljj5_must-gather-8jmx2_ef96b9a9-00d5-410f-a233-c4df68302d90/copy/0.log" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.650373 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.733236 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output\") pod \"ef96b9a9-00d5-410f-a233-c4df68302d90\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.733519 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwksz\" (UniqueName: \"kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz\") pod \"ef96b9a9-00d5-410f-a233-c4df68302d90\" (UID: \"ef96b9a9-00d5-410f-a233-c4df68302d90\") " Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.747746 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz" (OuterVolumeSpecName: "kube-api-access-nwksz") pod "ef96b9a9-00d5-410f-a233-c4df68302d90" (UID: "ef96b9a9-00d5-410f-a233-c4df68302d90"). InnerVolumeSpecName "kube-api-access-nwksz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.835484 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwksz\" (UniqueName: \"kubernetes.io/projected/ef96b9a9-00d5-410f-a233-c4df68302d90-kube-api-access-nwksz\") on node \"crc\" DevicePath \"\"" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.919415 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ef96b9a9-00d5-410f-a233-c4df68302d90" (UID: "ef96b9a9-00d5-410f-a233-c4df68302d90"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:17:07 crc kubenswrapper[4693]: I1204 11:17:07.936928 4693 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ef96b9a9-00d5-410f-a233-c4df68302d90-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 11:17:08 crc kubenswrapper[4693]: I1204 11:17:08.304729 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zljj5_must-gather-8jmx2_ef96b9a9-00d5-410f-a233-c4df68302d90/copy/0.log" Dec 04 11:17:08 crc kubenswrapper[4693]: I1204 11:17:08.305174 4693 scope.go:117] "RemoveContainer" containerID="b91533a9e6f4e4172ac4363b1c60c0d495cafd811e2d7a19867d53a678edbbcb" Dec 04 11:17:08 crc kubenswrapper[4693]: I1204 11:17:08.305221 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zljj5/must-gather-8jmx2" Dec 04 11:17:08 crc kubenswrapper[4693]: I1204 11:17:08.337418 4693 scope.go:117] "RemoveContainer" containerID="cf93b52f8706c2d851e302cfcf7ceae09c281b25ecab84c9716b3a1e38848696" Dec 04 11:17:08 crc kubenswrapper[4693]: I1204 11:17:08.472957 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" path="/var/lib/kubelet/pods/ef96b9a9-00d5-410f-a233-c4df68302d90/volumes" Dec 04 11:17:22 crc kubenswrapper[4693]: I1204 11:17:22.273258 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:17:22 crc kubenswrapper[4693]: I1204 11:17:22.273849 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:17:52 crc kubenswrapper[4693]: I1204 11:17:52.273032 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:17:52 crc kubenswrapper[4693]: I1204 11:17:52.273628 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 11:17:52 crc kubenswrapper[4693]: I1204 11:17:52.273678 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 11:17:52 crc kubenswrapper[4693]: I1204 11:17:52.274475 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:17:52 crc kubenswrapper[4693]: I1204 11:17:52.274544 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60" gracePeriod=600 Dec 04 11:17:53 crc kubenswrapper[4693]: I1204 11:17:53.133445 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60" exitCode=0 Dec 04 11:17:53 crc kubenswrapper[4693]: I1204 11:17:53.133498 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60"} Dec 04 11:17:53 crc kubenswrapper[4693]: I1204 11:17:53.134027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5"} Dec 04 11:17:53 crc kubenswrapper[4693]: I1204 11:17:53.134052 4693 scope.go:117] "RemoveContainer" containerID="4bccc21c25c93b11056c1e0a286a463d3714ee3ba53b48448999d3afe2406836" Dec 04 11:18:12 crc kubenswrapper[4693]: I1204 11:18:12.614699 4693 scope.go:117] "RemoveContainer" containerID="d9430d8b6ade17028afabc35feb097d4f94fc94e78a81f4a27f37ae7978b5ab5" Dec 04 11:18:12 crc kubenswrapper[4693]: I1204 11:18:12.638085 4693 scope.go:117] "RemoveContainer" containerID="d80c13474cd515121c4756c722872ad5d43b871ad08615aa2a543405522317ce" Dec 04 11:19:52 crc kubenswrapper[4693]: I1204 11:19:52.273408 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:19:52 crc kubenswrapper[4693]: I1204 11:19:52.273912 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.523753 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fkp7/must-gather-km9fd"] Dec 04 11:20:12 crc kubenswrapper[4693]: E1204 11:20:12.524733 4693 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="gather" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.524753 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="gather" Dec 04 11:20:12 crc kubenswrapper[4693]: E1204 11:20:12.524789 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a856208-53e8-497e-a0f8-901f4977b416" containerName="collect-profiles" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.524797 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a856208-53e8-497e-a0f8-901f4977b416" containerName="collect-profiles" Dec 04 11:20:12 crc kubenswrapper[4693]: E1204 11:20:12.524810 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="copy" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.524819 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="copy" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.525098 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="copy" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.525125 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a856208-53e8-497e-a0f8-901f4977b416" containerName="collect-profiles" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.525137 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef96b9a9-00d5-410f-a233-c4df68302d90" containerName="gather" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.526230 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.527904 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9fkp7"/"openshift-service-ca.crt" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.533236 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9fkp7"/"kube-root-ca.crt" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.542716 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output\") pod \"must-gather-km9fd\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.543000 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj\") pod \"must-gather-km9fd\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.555534 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fkp7/must-gather-km9fd"] Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.644494 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj\") pod \"must-gather-km9fd\" (UID: 
\"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.644575 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output\") pod \"must-gather-km9fd\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.645089 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output\") pod \"must-gather-km9fd\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.673827 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj\") pod \"must-gather-km9fd\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:12 crc kubenswrapper[4693]: I1204 11:20:12.851486 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:20:13 crc kubenswrapper[4693]: I1204 11:20:13.317786 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fkp7/must-gather-km9fd"] Dec 04 11:20:13 crc kubenswrapper[4693]: I1204 11:20:13.493613 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/must-gather-km9fd" event={"ID":"96fe5a48-f46f-4397-973d-87b60628ba4b","Type":"ContainerStarted","Data":"7690470bb998b7d6a7dcb714a155a453d0c2e5ce1f189e9b996c2e4bfb470dc8"} Dec 04 11:20:15 crc kubenswrapper[4693]: I1204 11:20:15.512611 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/must-gather-km9fd" event={"ID":"96fe5a48-f46f-4397-973d-87b60628ba4b","Type":"ContainerStarted","Data":"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be"} Dec 04 11:20:18 crc kubenswrapper[4693]: I1204 11:20:18.547822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/must-gather-km9fd" event={"ID":"96fe5a48-f46f-4397-973d-87b60628ba4b","Type":"ContainerStarted","Data":"a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244"} Dec 04 11:20:18 crc kubenswrapper[4693]: I1204 11:20:18.909901 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-v875n"] Dec 04 11:20:18 crc kubenswrapper[4693]: I1204 11:20:18.911710 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:18 crc kubenswrapper[4693]: I1204 11:20:18.914905 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9fkp7"/"default-dockercfg-t5zhp" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.013858 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcvh\" (UniqueName: \"kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.013989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.115579 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.115722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.115764 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcvh\" (UniqueName: \"kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.138844 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcvh\" (UniqueName: \"kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh\") pod \"crc-debug-v875n\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.230878 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.557901 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-v875n" event={"ID":"b3748e9d-952c-4b8a-adaf-0b4d226813a1","Type":"ContainerStarted","Data":"c8abec41f3fb3de6f5dc96863a6e14db7d4da6becac3074182dfdc6d0eeeb137"} Dec 04 11:20:19 crc kubenswrapper[4693]: I1204 11:20:19.574974 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fkp7/must-gather-km9fd" podStartSLOduration=7.574949721 podStartE2EDuration="7.574949721s" podCreationTimestamp="2025-12-04 11:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:20:19.569178277 +0000 UTC m=+5865.466772030" watchObservedRunningTime="2025-12-04 11:20:19.574949721 +0000 UTC m=+5865.472543474" Dec 04 11:20:20 crc kubenswrapper[4693]: I1204 11:20:20.566897 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-v875n" event={"ID":"b3748e9d-952c-4b8a-adaf-0b4d226813a1","Type":"ContainerStarted","Data":"52a9f1f6c66ecb0f1aff1cd77ea7223a217922b350097760ae11166c815ac446"} Dec 04 11:20:20 crc kubenswrapper[4693]: I1204 11:20:20.594092 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fkp7/crc-debug-v875n" podStartSLOduration=2.594068596 podStartE2EDuration="2.594068596s" podCreationTimestamp="2025-12-04 11:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:20:20.582979679 +0000 UTC m=+5866.480573432" watchObservedRunningTime="2025-12-04 11:20:20.594068596 +0000 UTC m=+5866.491662349" Dec 04 11:20:22 crc kubenswrapper[4693]: I1204 11:20:22.274072 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:20:22 crc kubenswrapper[4693]: I1204 11:20:22.274708 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.273116 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.275162 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.275379 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.276516 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.276653 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" gracePeriod=600 Dec 04 11:20:52 crc kubenswrapper[4693]: E1204 11:20:52.417831 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.861802 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" exitCode=0 Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.861854 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5"} Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.861891 4693 scope.go:117] "RemoveContainer" containerID="37f5b4da1eac4bd8530c08f10e1ab470ebfac38f4d7d7da5524b314f5c190a60" Dec 04 11:20:52 crc kubenswrapper[4693]: I1204 11:20:52.863024 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:20:52 crc kubenswrapper[4693]: E1204 11:20:52.863316 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:21:07 crc kubenswrapper[4693]: I1204 11:21:07.461104 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:21:07 crc kubenswrapper[4693]: E1204 11:21:07.461911 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:21:07 crc kubenswrapper[4693]: I1204 
11:21:07.995745 4693 generic.go:334] "Generic (PLEG): container finished" podID="b3748e9d-952c-4b8a-adaf-0b4d226813a1" containerID="52a9f1f6c66ecb0f1aff1cd77ea7223a217922b350097760ae11166c815ac446" exitCode=0 Dec 04 11:21:07 crc kubenswrapper[4693]: I1204 11:21:07.995812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-v875n" event={"ID":"b3748e9d-952c-4b8a-adaf-0b4d226813a1","Type":"ContainerDied","Data":"52a9f1f6c66ecb0f1aff1cd77ea7223a217922b350097760ae11166c815ac446"} Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.156408 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.195027 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-v875n"] Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.203531 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-v875n"] Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.348361 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfcvh\" (UniqueName: \"kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh\") pod \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.348444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host\") pod \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\" (UID: \"b3748e9d-952c-4b8a-adaf-0b4d226813a1\") " Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.348654 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host" (OuterVolumeSpecName: "host") pod "b3748e9d-952c-4b8a-adaf-0b4d226813a1" (UID: "b3748e9d-952c-4b8a-adaf-0b4d226813a1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.348978 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3748e9d-952c-4b8a-adaf-0b4d226813a1-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.357112 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh" (OuterVolumeSpecName: "kube-api-access-cfcvh") pod "b3748e9d-952c-4b8a-adaf-0b4d226813a1" (UID: "b3748e9d-952c-4b8a-adaf-0b4d226813a1"). InnerVolumeSpecName "kube-api-access-cfcvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:21:09 crc kubenswrapper[4693]: I1204 11:21:09.451302 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfcvh\" (UniqueName: \"kubernetes.io/projected/b3748e9d-952c-4b8a-adaf-0b4d226813a1-kube-api-access-cfcvh\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.014960 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8abec41f3fb3de6f5dc96863a6e14db7d4da6becac3074182dfdc6d0eeeb137" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.015231 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-v875n" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.334411 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-pz424"] Dec 04 11:21:10 crc kubenswrapper[4693]: E1204 11:21:10.334878 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3748e9d-952c-4b8a-adaf-0b4d226813a1" containerName="container-00" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.334895 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3748e9d-952c-4b8a-adaf-0b4d226813a1" containerName="container-00" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.335141 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3748e9d-952c-4b8a-adaf-0b4d226813a1" containerName="container-00" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.336047 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.338031 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9fkp7"/"default-dockercfg-t5zhp" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.370302 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76dr\" (UniqueName: \"kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.370618 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.471838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76dr\" (UniqueName: \"kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.472275 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.472593 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.474158 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3748e9d-952c-4b8a-adaf-0b4d226813a1" path="/var/lib/kubelet/pods/b3748e9d-952c-4b8a-adaf-0b4d226813a1/volumes" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.499873 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76dr\" (UniqueName: 
\"kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr\") pod \"crc-debug-pz424\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:10 crc kubenswrapper[4693]: I1204 11:21:10.654788 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:11 crc kubenswrapper[4693]: I1204 11:21:11.027485 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-pz424" event={"ID":"4ca42d57-536e-4723-9edb-c8f3253346c7","Type":"ContainerStarted","Data":"09c7c6af1833271d3ba780987928c304d71e7545021218dd3b274d111344e38a"} Dec 04 11:21:12 crc kubenswrapper[4693]: I1204 11:21:12.039789 4693 generic.go:334] "Generic (PLEG): container finished" podID="4ca42d57-536e-4723-9edb-c8f3253346c7" containerID="1663c8986d5393fccb322ad97997696cdb89d3becf1418280c1dda8b4628b9f1" exitCode=0 Dec 04 11:21:12 crc kubenswrapper[4693]: I1204 11:21:12.040028 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-pz424" event={"ID":"4ca42d57-536e-4723-9edb-c8f3253346c7","Type":"ContainerDied","Data":"1663c8986d5393fccb322ad97997696cdb89d3becf1418280c1dda8b4628b9f1"} Dec 04 11:21:12 crc kubenswrapper[4693]: I1204 11:21:12.763787 4693 scope.go:117] "RemoveContainer" containerID="0149853cbc8adf0ea71815603b788829820ea1e08119d1d6b4c95f6a91ee8974" Dec 04 11:21:12 crc kubenswrapper[4693]: I1204 11:21:12.783245 4693 scope.go:117] "RemoveContainer" containerID="45cbe399558bd8b0bd7cd1936d37b75ee7b99fcf38ce72dcb98ea2f7d4f0ca91" Dec 04 11:21:12 crc kubenswrapper[4693]: I1204 11:21:12.803181 4693 scope.go:117] "RemoveContainer" containerID="dad58f727da88c4218f9496bc4af0697dae2cf2f128eb8419d17f8d56f5fb73a" Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.169793 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.319860 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h76dr\" (UniqueName: \"kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr\") pod \"4ca42d57-536e-4723-9edb-c8f3253346c7\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.319945 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host\") pod \"4ca42d57-536e-4723-9edb-c8f3253346c7\" (UID: \"4ca42d57-536e-4723-9edb-c8f3253346c7\") " Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.320451 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host" (OuterVolumeSpecName: "host") pod "4ca42d57-536e-4723-9edb-c8f3253346c7" (UID: "4ca42d57-536e-4723-9edb-c8f3253346c7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.331924 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr" (OuterVolumeSpecName: "kube-api-access-h76dr") pod "4ca42d57-536e-4723-9edb-c8f3253346c7" (UID: "4ca42d57-536e-4723-9edb-c8f3253346c7"). InnerVolumeSpecName "kube-api-access-h76dr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.422187 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h76dr\" (UniqueName: \"kubernetes.io/projected/4ca42d57-536e-4723-9edb-c8f3253346c7-kube-api-access-h76dr\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:13 crc kubenswrapper[4693]: I1204 11:21:13.422227 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca42d57-536e-4723-9edb-c8f3253346c7-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:14 crc kubenswrapper[4693]: I1204 11:21:14.064038 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-pz424" event={"ID":"4ca42d57-536e-4723-9edb-c8f3253346c7","Type":"ContainerDied","Data":"09c7c6af1833271d3ba780987928c304d71e7545021218dd3b274d111344e38a"} Dec 04 11:21:14 crc kubenswrapper[4693]: I1204 11:21:14.064137 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c7c6af1833271d3ba780987928c304d71e7545021218dd3b274d111344e38a" Dec 04 11:21:14 crc kubenswrapper[4693]: I1204 11:21:14.064078 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-pz424" Dec 04 11:21:14 crc kubenswrapper[4693]: I1204 11:21:14.966206 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-pz424"] Dec 04 11:21:14 crc kubenswrapper[4693]: I1204 11:21:14.976894 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-pz424"] Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.125764 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-jp9ws"] Dec 04 11:21:16 crc kubenswrapper[4693]: E1204 11:21:16.126663 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca42d57-536e-4723-9edb-c8f3253346c7" containerName="container-00" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.126679 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca42d57-536e-4723-9edb-c8f3253346c7" containerName="container-00" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.126854 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca42d57-536e-4723-9edb-c8f3253346c7" containerName="container-00" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.127490 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.129370 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9fkp7"/"default-dockercfg-t5zhp" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.278199 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.278658 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqsd2\" (UniqueName: \"kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.380922 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsd2\" (UniqueName: \"kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.381070 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.381183 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.400045 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqsd2\" (UniqueName: \"kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2\") pod \"crc-debug-jp9ws\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.443186 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:16 crc kubenswrapper[4693]: I1204 11:21:16.478786 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca42d57-536e-4723-9edb-c8f3253346c7" path="/var/lib/kubelet/pods/4ca42d57-536e-4723-9edb-c8f3253346c7/volumes" Dec 04 11:21:17 crc kubenswrapper[4693]: I1204 11:21:17.095034 4693 generic.go:334] "Generic (PLEG): container finished" podID="5059293d-f8da-4487-9b36-a7ca7aead90c" containerID="1feb874b5d52304a7561e570adf6f21be4e7b0805263e94f1e3c4b19990ce176" exitCode=0 Dec 04 11:21:17 crc kubenswrapper[4693]: I1204 11:21:17.095213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" event={"ID":"5059293d-f8da-4487-9b36-a7ca7aead90c","Type":"ContainerDied","Data":"1feb874b5d52304a7561e570adf6f21be4e7b0805263e94f1e3c4b19990ce176"} Dec 04 11:21:17 crc kubenswrapper[4693]: I1204 11:21:17.095443 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" event={"ID":"5059293d-f8da-4487-9b36-a7ca7aead90c","Type":"ContainerStarted","Data":"d8cc632505dd3f9bc565354b86e362c418a49785d46789ef5c7ce333e49e0719"} Dec 04 11:21:17 crc kubenswrapper[4693]: I1204 11:21:17.147689 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-jp9ws"] Dec 04 11:21:17 crc kubenswrapper[4693]: I1204 11:21:17.163630 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fkp7/crc-debug-jp9ws"] Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.206749 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.316311 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqsd2\" (UniqueName: \"kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2\") pod \"5059293d-f8da-4487-9b36-a7ca7aead90c\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.316641 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host\") pod \"5059293d-f8da-4487-9b36-a7ca7aead90c\" (UID: \"5059293d-f8da-4487-9b36-a7ca7aead90c\") " Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.316760 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host" (OuterVolumeSpecName: "host") pod "5059293d-f8da-4487-9b36-a7ca7aead90c" (UID: "5059293d-f8da-4487-9b36-a7ca7aead90c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.317122 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5059293d-f8da-4487-9b36-a7ca7aead90c-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.337063 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2" (OuterVolumeSpecName: "kube-api-access-gqsd2") pod "5059293d-f8da-4487-9b36-a7ca7aead90c" (UID: "5059293d-f8da-4487-9b36-a7ca7aead90c"). InnerVolumeSpecName "kube-api-access-gqsd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.418660 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqsd2\" (UniqueName: \"kubernetes.io/projected/5059293d-f8da-4487-9b36-a7ca7aead90c-kube-api-access-gqsd2\") on node \"crc\" DevicePath \"\"" Dec 04 11:21:18 crc kubenswrapper[4693]: I1204 11:21:18.470687 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5059293d-f8da-4487-9b36-a7ca7aead90c" path="/var/lib/kubelet/pods/5059293d-f8da-4487-9b36-a7ca7aead90c/volumes" Dec 04 11:21:18 crc kubenswrapper[4693]: E1204 11:21:18.566487 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5059293d_f8da_4487_9b36_a7ca7aead90c.slice/crio-d8cc632505dd3f9bc565354b86e362c418a49785d46789ef5c7ce333e49e0719\": RecentStats: unable to find data in memory cache]" Dec 04 11:21:19 crc kubenswrapper[4693]: I1204 11:21:19.115644 4693 scope.go:117] "RemoveContainer" containerID="1feb874b5d52304a7561e570adf6f21be4e7b0805263e94f1e3c4b19990ce176" Dec 04 11:21:19 crc kubenswrapper[4693]: I1204 11:21:19.115711 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fkp7/crc-debug-jp9ws" Dec 04 11:21:22 crc kubenswrapper[4693]: I1204 11:21:22.461202 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:21:22 crc kubenswrapper[4693]: E1204 11:21:22.461935 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:21:32 crc kubenswrapper[4693]: I1204 11:21:32.861497 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dcd5dc8-znmkm_c8aed54b-9500-4b6d-a966-64fb3cff7b45/barbican-api/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.021853 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dcd5dc8-znmkm_c8aed54b-9500-4b6d-a966-64fb3cff7b45/barbican-api-log/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.061363 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76bdf94f96-jnvk8_33e00b6b-bd3b-4198-8333-1515f919cbfc/barbican-keystone-listener/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.359076 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d7c5fcd89-hgn5p_ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340/barbican-worker/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.376008 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d7c5fcd89-hgn5p_ae6e87d6-e55f-4cb2-8e6d-103bfdfb4340/barbican-worker-log/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.462087 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:21:33 crc kubenswrapper[4693]: E1204 11:21:33.462613 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.665284 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rpbpv_bd60e9f3-ac52-4a2b-9e3b-80720e7634ab/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.815784 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/ceilometer-central-agent/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.920521 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/proxy-httpd/0.log" Dec 04 11:21:33 crc kubenswrapper[4693]: I1204 11:21:33.952318 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/ceilometer-notification-agent/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.029103 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fcae802-3512-4246-bbe8-fc93ecb2505d/sg-core/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.039418 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76bdf94f96-jnvk8_33e00b6b-bd3b-4198-8333-1515f919cbfc/barbican-keystone-listener-log/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.296201 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_5f0438dc-0fdb-48e2-a807-3292d8bb3fed/ceph/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.671242 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bfcdc882-9b7b-4e42-877e-6e8be8597470/cinder-api/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.891391 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_ca5c459b-21a7-4799-a516-2a270de6e246/probe/0.log" Dec 04 11:21:34 crc kubenswrapper[4693]: I1204 11:21:34.918894 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bfcdc882-9b7b-4e42-877e-6e8be8597470/cinder-api-log/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.224918 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_487df7df-e43a-48a6-8350-6b9804d13e39/cinder-scheduler/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.318784 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_487df7df-e43a-48a6-8350-6b9804d13e39/probe/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.635137 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_ca5c459b-21a7-4799-a516-2a270de6e246/cinder-backup/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.653399 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_711f27ff-01df-4851-bc37-a7115b5fa624/probe/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.800133 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-994x5_5aeee95a-198f-47ed-859b-0f710da9768c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:35 crc kubenswrapper[4693]: I1204 11:21:35.995729 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2p64q_b9ee0ede-3c46-4a69-b2e5-00901d0ee8a6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.170863 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/init/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.385524 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/init/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.600205 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-84b5f8b59f-krd95_60e50add-23a4-48de-a35c-0275bab951b1/dnsmasq-dns/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.642503 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b8ktz_710e62b8-160d-49f9-8bdb-418a0ee9f379/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.817161 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56bd6fe8-e97b-4c07-a204-ee44c09401b7/glance-log/0.log" Dec 04 11:21:36 crc kubenswrapper[4693]: I1204 11:21:36.874065 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56bd6fe8-e97b-4c07-a204-ee44c09401b7/glance-httpd/0.log" Dec 04 11:21:37 crc kubenswrapper[4693]: I1204 11:21:37.063925 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e1d7ce8c-35a9-406c-9b7d-10e4976bb156/glance-httpd/0.log" Dec 04 11:21:37 crc kubenswrapper[4693]: I1204 11:21:37.074341 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e1d7ce8c-35a9-406c-9b7d-10e4976bb156/glance-log/0.log" Dec 04 11:21:37 crc kubenswrapper[4693]: I1204 11:21:37.429128 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bzq9l_8ba7c573-448f-438c-9999-ffe4e8c28f52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:37 crc kubenswrapper[4693]: I1204 11:21:37.435796 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f8cd9d6cb-vf5bx_ca2592b0-5dfd-4d15-996c-2340af86bd26/horizon/0.log" Dec 04 11:21:37 crc kubenswrapper[4693]: I1204 11:21:37.681383 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cld2n_c8c97263-d34a-4231-9f52-5f0aae7163f2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:38 crc kubenswrapper[4693]: I1204 11:21:38.005851 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414101-snzll_e7a046b6-d7ff-46b3-a107-58e6e843bfff/keystone-cron/0.log" Dec 04 11:21:38 crc kubenswrapper[4693]: I1204 11:21:38.226662 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4f9ccd88-ae2a-4026-b492-a09d29799c89/kube-state-metrics/0.log" Dec 04 11:21:38 crc 
kubenswrapper[4693]: I1204 11:21:38.319711 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f8cd9d6cb-vf5bx_ca2592b0-5dfd-4d15-996c-2340af86bd26/horizon-log/0.log" Dec 04 11:21:38 crc kubenswrapper[4693]: I1204 11:21:38.455371 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_711f27ff-01df-4851-bc37-a7115b5fa624/cinder-volume/0.log" Dec 04 11:21:38 crc kubenswrapper[4693]: I1204 11:21:38.484528 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lxb8p_a943a73c-465d-4a30-be17-967c79007a91/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:38 crc kubenswrapper[4693]: I1204 11:21:38.969732 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5f40f194-33e3-4723-817f-981394e545b9/probe/0.log" Dec 04 11:21:39 crc kubenswrapper[4693]: I1204 11:21:39.335890 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5f40f194-33e3-4723-817f-981394e545b9/manila-scheduler/0.log" Dec 04 11:21:39 crc kubenswrapper[4693]: I1204 11:21:39.388038 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b6dc0f8-064a-4748-b69a-11713fe55088/manila-api/0.log" Dec 04 11:21:39 crc kubenswrapper[4693]: I1204 11:21:39.592235 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_33b8b8b4-d56e-4c4c-9e87-95d334534e74/probe/0.log" Dec 04 11:21:39 crc kubenswrapper[4693]: I1204 11:21:39.977291 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b6dc0f8-064a-4748-b69a-11713fe55088/manila-api-log/0.log" Dec 04 11:21:40 crc kubenswrapper[4693]: I1204 11:21:40.064086 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_33b8b8b4-d56e-4c4c-9e87-95d334534e74/manila-share/0.log" Dec 04 11:21:40 crc kubenswrapper[4693]: I1204 11:21:40.555729 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-trz2s_cbd12578-e1a3-41b0-95be-2162e189daae/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:40 crc kubenswrapper[4693]: I1204 11:21:40.855140 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77d49c9649-fpwft_c2c81aab-5f08-429a-941e-9890ef46273e/neutron-httpd/0.log" Dec 04 11:21:41 crc kubenswrapper[4693]: I1204 11:21:41.988581 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77d49c9649-fpwft_c2c81aab-5f08-429a-941e-9890ef46273e/neutron-api/0.log" Dec 04 11:21:42 crc kubenswrapper[4693]: I1204 11:21:42.888067 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f774bdc67-hjxts_fc3b6747-ed65-46ef-8034-e35edf80ac90/keystone-api/0.log" Dec 04 11:21:42 crc kubenswrapper[4693]: I1204 11:21:42.968436 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2ea5071f-1037-494c-b12f-ebddb5deb122/nova-cell0-conductor-conductor/0.log" Dec 04 11:21:43 crc kubenswrapper[4693]: I1204 11:21:43.390505 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d658460b-438a-46f0-88e1-136741999c81/nova-api-log/0.log" Dec 04 11:21:43 crc kubenswrapper[4693]: I1204 11:21:43.485707 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_7cece472-f359-4ce8-b1f8-17ca920f4b3d/nova-cell1-conductor-conductor/0.log" Dec 04 11:21:43 crc kubenswrapper[4693]: I1204 11:21:43.835065 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bba2b0cd-4556-4a03-a111-d73471571173/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 11:21:43 crc kubenswrapper[4693]: I1204 11:21:43.848313 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jj8z6_50c8246f-670e-4056-9c35-19e8042a96bf/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.153451 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc70aae2-116d-4528-8a5a-efab89d7e53b/nova-metadata-log/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.226410 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d658460b-438a-46f0-88e1-136741999c81/nova-api-api/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.494015 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/mysql-bootstrap/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.653596 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/mysql-bootstrap/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.725976 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b85d3f3e-5811-4829-8b36-96ecb7f22492/galera/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.862520 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ca70cccf-b92e-4997-9ca9-1375a2cceca1/nova-scheduler-scheduler/0.log" Dec 04 11:21:44 crc kubenswrapper[4693]: I1204 11:21:44.983104 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/mysql-bootstrap/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.170437 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/mysql-bootstrap/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.263774 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e55c8437-1394-45c9-b135-2dbe68895d38/galera/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.363134 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b2fefcbd-5f7f-4544-8f03-49adbe23a11b/openstackclient/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.526420 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-clqww_e8b66ffe-c672-438c-ab15-a4a44563152d/openstack-network-exporter/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.646011 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server-init/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.961863 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server-init/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.980862 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovs-vswitchd/0.log" Dec 04 11:21:45 crc kubenswrapper[4693]: I1204 11:21:45.996064 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7nh5c_cd544a1e-c7e1-4f04-90f5-d9cf152c4f12/ovsdb-server/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.231323 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zczb4_ef9b299a-d8f0-45b8-97ea-d352d2c6ac6a/ovn-controller/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.342752 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bc70aae2-116d-4528-8a5a-efab89d7e53b/nova-metadata-metadata/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.483861 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wkvzh_ab7d9721-bbc8-489e-96de-98ce148725de/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.568163 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eec5c741-f1c6-424f-b3e1-4f5219fa0bf0/ovn-northd/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.583893 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eec5c741-f1c6-424f-b3e1-4f5219fa0bf0/openstack-network-exporter/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.761783 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_14639c36-341c-4f90-980b-b9fffce3c8f8/ovsdbserver-nb/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.765533 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_14639c36-341c-4f90-980b-b9fffce3c8f8/openstack-network-exporter/0.log" Dec 04 11:21:46 crc kubenswrapper[4693]: I1204 11:21:46.998361 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9999b422-d127-4990-8091-9446e589839a/ovsdbserver-sb/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.002284 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9999b422-d127-4990-8091-9446e589839a/openstack-network-exporter/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.261123 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/setup-container/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.459127 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-596dc75986-wjgrk_7e6c8844-46bb-47e2-99d2-a9da861757e7/placement-api/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.460975 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:21:47 crc kubenswrapper[4693]: E1204 11:21:47.461288 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.551002 4693 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/setup-container/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.595890 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c1ee328e-d29f-4224-913b-bc23195bf2b2/rabbitmq/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.777859 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-596dc75986-wjgrk_7e6c8844-46bb-47e2-99d2-a9da861757e7/placement-log/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.794849 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/setup-container/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.932122 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/setup-container/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.955775 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c0a72230-d599-4df6-bd4b-279092bf8861/rabbitmq/0.log" Dec 04 11:21:47 crc kubenswrapper[4693]: I1204 11:21:47.968827 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xlrd5_db089c88-39e5-4e8f-93b8-b02a59f50b93/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.138092 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ksdq9_80be6b5e-e208-4c31-a663-4a01f460ea18/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.339259 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fcpx5_21d1c5c3-32a7-4aa6-9eab-a4fcc3d15288/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.513073 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5wn5w_a57888c7-06f9-478c-9e80-3c028cabcb28/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.617644 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-tkkcn_ff072e20-bb88-4bc8-8e07-a12912774161/ssh-known-hosts-edpm-deployment/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.865066 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57857fb86f-8m84s_3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a/proxy-server/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.930503 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57857fb86f-8m84s_3a23d4b8-5c5b-4fdf-b5bc-ab82acef3e0a/proxy-httpd/0.log" Dec 04 11:21:48 crc kubenswrapper[4693]: I1204 11:21:48.955093 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xz7js_39170f53-93c9-49fd-8dba-42d325269e74/swift-ring-rebalance/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.169052 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-auditor/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.171506 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-reaper/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.203377 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-replicator/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.399178 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-auditor/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.422341 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-replicator/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.431858 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/account-server/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.435243 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-server/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.575507 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/container-updater/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.641264 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-expirer/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.684251 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-auditor/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.687280 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-replicator/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.822452 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-updater/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.853584 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/object-server/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.925037 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/rsync/0.log" Dec 04 11:21:49 crc kubenswrapper[4693]: I1204 11:21:49.949821 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73554998-24a4-4d23-a78d-66d51cbe24af/swift-recon-cron/0.log" Dec 04 11:21:50 crc kubenswrapper[4693]: I1204 11:21:50.114153 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5l6x5_ba3e6d3f-0285-4742-80ba-4c15da05164c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:21:50 crc kubenswrapper[4693]: I1204 11:21:50.264418 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_da1a36ac-5a8b-475d-8434-eb43b0f8a656/tempest-tests-tempest-tests-runner/0.log" Dec 04 11:21:50 crc kubenswrapper[4693]: I1204 11:21:50.388711 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0e4ff807-907a-4310-97c4-7e60e55dcaca/test-operator-logs-container/0.log" Dec 04 11:21:50 crc kubenswrapper[4693]: I1204 11:21:50.533845 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t6vwn_bbf583ab-d797-4781-a13e-d4d493483d3e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:22:00 crc kubenswrapper[4693]: I1204 11:22:00.461165 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:22:00 crc kubenswrapper[4693]: E1204 11:22:00.462100 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:22:06 crc kubenswrapper[4693]: I1204 11:22:06.065038 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f304446b-e129-40a5-bc56-a79d0b973f0a/memcached/0.log" Dec 04 11:22:14 crc kubenswrapper[4693]: I1204 11:22:14.461226 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:22:14 crc kubenswrapper[4693]: E1204 11:22:14.462048 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.433000 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.646012 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.650747 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.682360 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.826714 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/util/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.839359 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/pull/0.log" Dec 04 11:22:17 crc kubenswrapper[4693]: I1204 11:22:17.872807 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_14e2890b8fcd2f16f85bddfbabb90ea8a4bea946ba86f42a442be37689bbxr4_53398369-8904-43ae-be7b-ced663828e5e/extract/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.009993 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vtgdw_5aa92828-abd4-4f89-9621-5e9830101fca/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.101294 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-vtgdw_5aa92828-abd4-4f89-9621-5e9830101fca/manager/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.107096 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-5j4cs_f3a27983-d919-48fb-a227-f6a45efef985/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.255884 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-5j4cs_f3a27983-d919-48fb-a227-f6a45efef985/manager/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.339327 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q8crl_2bb25289-630f-46c3-96f0-b5ea8177f5d8/manager/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.367610 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q8crl_2bb25289-630f-46c3-96f0-b5ea8177f5d8/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.543147 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-tnzrv_7fb21378-fa3f-41a2-a6da-80831acec23c/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.615680 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-tnzrv_7fb21378-fa3f-41a2-a6da-80831acec23c/manager/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.667516 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-jgmqz_4ba18ef1-50c1-48d0-9d2e-3c83c65913ab/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.749504 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-jgmqz_4ba18ef1-50c1-48d0-9d2e-3c83c65913ab/manager/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.826877 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4gqg9_79466fca-aa64-407e-9488-d89e43d4bed9/kube-rbac-proxy/0.log" Dec 04 11:22:18 crc kubenswrapper[4693]: I1204 11:22:18.868175 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-4gqg9_79466fca-aa64-407e-9488-d89e43d4bed9/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.030177 4693 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-46wh4_b7bce599-dd9d-43c5-b5a9-53a081b6f183/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.224507 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qcc4g_37983465-c081-4645-9a0b-47431d284dbe/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.224659 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-qcc4g_37983465-c081-4645-9a0b-47431d284dbe/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.238444 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-46wh4_b7bce599-dd9d-43c5-b5a9-53a081b6f183/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.428299 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gtwgq_65a2270f-58bd-486b-9be3-c85fee980070/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.507357 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gtwgq_65a2270f-58bd-486b-9be3-c85fee980070/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.521645 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4mp89_1de6adcf-e847-4a10-af8c-683f83c32551/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.674781 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-np6kl_ae731a83-bab7-4843-b413-e8b03a3ca1c3/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.711004 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4mp89_1de6adcf-e847-4a10-af8c-683f83c32551/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.747430 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-np6kl_ae731a83-bab7-4843-b413-e8b03a3ca1c3/manager/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.885992 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_92ac4c28-9d59-4955-b5cf-ae45e97fdeed/kube-rbac-proxy/0.log" Dec 04 11:22:19 crc kubenswrapper[4693]: I1204 11:22:19.917620 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-w7zh6_92ac4c28-9d59-4955-b5cf-ae45e97fdeed/manager/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.082184 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-klr7c_fc97bab3-20bf-4931-868d-a20ad433cc81/kube-rbac-proxy/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.183667 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-klr7c_fc97bab3-20bf-4931-868d-a20ad433cc81/manager/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 
11:22:20.185813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zg6gn_9b4ce9a3-bc13-4726-af72-c0f4c619efec/kube-rbac-proxy/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.332156 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zg6gn_9b4ce9a3-bc13-4726-af72-c0f4c619efec/manager/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.391960 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7_287b0c68-a203-4af6-b654-2eb97b004cdc/manager/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.429730 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd47pjr7_287b0c68-a203-4af6-b654-2eb97b004cdc/kube-rbac-proxy/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.811313 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-shhvj_f908e4a3-f7a8-4165-b9b1-8b25f43727e1/registry-server/0.log" Dec 04 11:22:20 crc kubenswrapper[4693]: I1204 11:22:20.918812 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5959575f68-6dnpv_4c440aca-49fc-4f5b-8890-d2b8c021febf/operator/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.102923 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-cxrdm_b352e856-6946-41ed-8d06-46b1ab00185e/kube-rbac-proxy/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.303175 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-cxrdm_b352e856-6946-41ed-8d06-46b1ab00185e/manager/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.324505 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4zbtc_75a5a37b-eb32-4654-85f7-1c7b9de1c247/kube-rbac-proxy/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.472762 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4zbtc_75a5a37b-eb32-4654-85f7-1c7b9de1c247/manager/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.525578 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xzz9b_4ffbc9ab-625c-467a-b3cb-017b4167d8a1/operator/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.757315 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gfvq4_23baa4a2-ca26-41a5-968c-f642ca80d1fa/kube-rbac-proxy/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.773005 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gfvq4_23baa4a2-ca26-41a5-968c-f642ca80d1fa/manager/0.log" Dec 04 11:22:21 crc kubenswrapper[4693]: I1204 11:22:21.863175 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zmpkb_a37bfc80-1ecc-4547-8fbe-be223b9a5cc2/kube-rbac-proxy/0.log" Dec 04 11:22:21 crc 
kubenswrapper[4693]: I1204 11:22:21.903026 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5bf9d46bf4-jn6kv_9b4532d5-fce3-43a3-b72c-c0752eae7945/manager/0.log" Dec 04 11:22:22 crc kubenswrapper[4693]: I1204 11:22:22.051452 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-zmpkb_a37bfc80-1ecc-4547-8fbe-be223b9a5cc2/manager/0.log" Dec 04 11:22:22 crc kubenswrapper[4693]: I1204 11:22:22.063387 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-vfrzv_b2d582c6-b444-4591-93c3-7681714732bc/kube-rbac-proxy/0.log" Dec 04 11:22:22 crc kubenswrapper[4693]: I1204 11:22:22.096296 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-vfrzv_b2d582c6-b444-4591-93c3-7681714732bc/manager/0.log" Dec 04 11:22:22 crc kubenswrapper[4693]: I1204 11:22:22.183611 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzpj7_383c7650-d095-4996-88c6-06d999b1973b/kube-rbac-proxy/0.log" Dec 04 11:22:22 crc kubenswrapper[4693]: I1204 11:22:22.254631 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzpj7_383c7650-d095-4996-88c6-06d999b1973b/manager/0.log" Dec 04 11:22:29 crc kubenswrapper[4693]: I1204 11:22:29.462466 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:22:29 crc kubenswrapper[4693]: E1204 11:22:29.463209 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:22:42 crc kubenswrapper[4693]: I1204 11:22:42.488461 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n9k5f_3e92c71e-1bcc-455f-a270-1dd051662af6/control-plane-machine-set-operator/0.log" Dec 04 11:22:42 crc kubenswrapper[4693]: I1204 11:22:42.682301 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gh7dl_46a329a4-a450-4e39-bcbe-c7dcba1e6939/machine-api-operator/0.log" Dec 04 11:22:42 crc kubenswrapper[4693]: I1204 11:22:42.683851 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gh7dl_46a329a4-a450-4e39-bcbe-c7dcba1e6939/kube-rbac-proxy/0.log" Dec 04 11:22:43 crc kubenswrapper[4693]: I1204 11:22:43.460936 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:22:43 crc kubenswrapper[4693]: E1204 11:22:43.461637 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:22:55 crc kubenswrapper[4693]: I1204 11:22:55.578310 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qc5kn_e8fb930d-7df9-4d8f-8edf-e5ae3ef734ce/cert-manager-controller/0.log" Dec 04 11:22:55 crc kubenswrapper[4693]: I1204 11:22:55.741043 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qn5sx_5f2c58ea-f0fb-4460-9794-64d3182b3b5f/cert-manager-cainjector/0.log" Dec 04 11:22:55 crc kubenswrapper[4693]: I1204 11:22:55.813205 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9zv9m_df9e8e44-fcb6-48e4-abc7-cb16efbb64bd/cert-manager-webhook/0.log" Dec 04 11:22:57 crc kubenswrapper[4693]: I1204 11:22:57.462754 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:22:57 crc kubenswrapper[4693]: E1204 11:22:57.463091 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:23:08 crc kubenswrapper[4693]: I1204 11:23:08.780189 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-6tjzs_04395eb3-d17a-45e1-8c76-5cef70217095/nmstate-console-plugin/0.log" Dec 04 11:23:08 crc kubenswrapper[4693]: I1204 11:23:08.931703 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-897g8_3ce4524c-acc8-42ff-b575-712649f91f33/nmstate-handler/0.log" Dec 04 11:23:09 crc kubenswrapper[4693]: I1204 11:23:09.027289 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qvq5w_5054f7b0-a486-497e-9514-4a1387e7f815/kube-rbac-proxy/0.log" Dec 04 11:23:09 crc kubenswrapper[4693]: I1204 11:23:09.059417 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qvq5w_5054f7b0-a486-497e-9514-4a1387e7f815/nmstate-metrics/0.log" Dec 04 11:23:09 crc kubenswrapper[4693]: I1204 11:23:09.238929 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-6q6qg_92bcc38e-785b-41c0-9bf0-60db671cf71c/nmstate-operator/0.log" Dec 04 11:23:09 crc kubenswrapper[4693]: I1204 11:23:09.269711 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-w2kzc_a3bf9b17-2a64-4697-b916-16b8c14f4bff/nmstate-webhook/0.log" Dec 04 11:23:12 crc kubenswrapper[4693]: I1204 11:23:12.461225 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:23:12 crc kubenswrapper[4693]: E1204 11:23:12.463065 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:23:23 crc kubenswrapper[4693]: I1204 11:23:23.590474 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-tvgv5_05ad9856-cd9f-4317-8c24-ebfa61baa56b/kube-rbac-proxy/0.log" Dec 04 11:23:23 crc kubenswrapper[4693]: I1204 11:23:23.778720 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-tvgv5_05ad9856-cd9f-4317-8c24-ebfa61baa56b/controller/0.log" Dec 04 11:23:23 crc kubenswrapper[4693]: I1204 11:23:23.864167 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.074146 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.075018 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.085941 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.105653 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.351121 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.447849 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.458069 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.479575 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.664776 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/controller/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.668936 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-frr-files/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.671516 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-reloader/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.682981 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/cp-metrics/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.841736 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/frr-metrics/0.log" Dec 04 11:23:24 crc 
kubenswrapper[4693]: I1204 11:23:24.849921 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/kube-rbac-proxy/0.log" Dec 04 11:23:24 crc kubenswrapper[4693]: I1204 11:23:24.856369 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/kube-rbac-proxy-frr/0.log" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.082701 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/reloader/0.log" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.108219 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-qdqd6_f57bca9c-8036-4ef7-8c6c-dcaa42b44c3a/frr-k8s-webhook-server/0.log" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.377316 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cfc67b4f5-6t7cx_990d09c4-a9e4-4233-ae12-3910f2937270/manager/0.log" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.460966 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:23:25 crc kubenswrapper[4693]: E1204 11:23:25.461214 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.496394 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5f644b44db-65csk_d7810505-b15b-4970-8cf0-f7217394a1ca/webhook-server/0.log" Dec 04 11:23:25 crc kubenswrapper[4693]: I1204 11:23:25.614239 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-f7pnf_8cb6de38-296b-415b-8f7c-aa037586a5db/kube-rbac-proxy/0.log" Dec 04 11:23:26 crc kubenswrapper[4693]: I1204 11:23:26.221355 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-f7pnf_8cb6de38-296b-415b-8f7c-aa037586a5db/speaker/0.log" Dec 04 11:23:26 crc kubenswrapper[4693]: I1204 11:23:26.576721 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rc5g6_1455f336-f228-46d6-b944-4c76aa652335/frr/0.log" Dec 04 11:23:37 crc kubenswrapper[4693]: I1204 11:23:37.997362 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.199301 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.236423 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.238953 4693 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.424871 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/pull/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.453776 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.482899 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnfblf_e2ff9719-16f5-410e-af26-8dadc807ca7e/extract/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.608812 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.759081 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.792747 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.793208 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.970232 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/util/0.log" Dec 04 11:23:38 crc kubenswrapper[4693]: I1204 11:23:38.993894 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/pull/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.002831 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p8chw_b6e8fbed-f202-40ea-ad04-5bbbb338f8e1/extract/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.169580 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.324492 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.327181 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 
11:23:39.338387 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.461642 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:23:39 crc kubenswrapper[4693]: E1204 11:23:39.462003 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.487222 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-utilities/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.490204 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/extract-content/0.log" Dec 04 11:23:39 crc kubenswrapper[4693]: I1204 11:23:39.703706 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.027995 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.037738 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.037880 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.259520 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-content/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.263516 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/extract-utilities/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.266129 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xd4kh_96d24e68-be12-479a-8354-d30848a5b7e1/registry-server/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.514844 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26frc_57e8fd24-01fe-42d0-9bd6-6066003c724b/marketplace-operator/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.720763 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.904880 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.927467 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:23:40 crc kubenswrapper[4693]: I1204 11:23:40.929404 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.245261 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-utilities/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.258392 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mm9jb_12ac24d7-f1d6-48c9-95cb-ac54e898ad99/registry-server/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.294399 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/extract-content/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.486252 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bt2sw_07fa793f-d0d3-469c-8110-cd6e594d40b9/registry-server/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.508592 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.692204 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.711089 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.713478 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.864436 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-content/0.log" Dec 04 11:23:41 crc kubenswrapper[4693]: I1204 11:23:41.896452 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/extract-utilities/0.log" Dec 04 11:23:42 crc kubenswrapper[4693]: I1204 11:23:42.638100 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zh65d_8ec6de5e-a0e3-440b-be6e-336aef2a5a24/registry-server/0.log" Dec 04 11:23:52 crc kubenswrapper[4693]: I1204 11:23:52.462119 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:23:52 crc kubenswrapper[4693]: E1204 11:23:52.462970 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:24:06 crc kubenswrapper[4693]: I1204 11:24:06.460792 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:24:06 crc kubenswrapper[4693]: E1204 11:24:06.461623 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:24:21 crc kubenswrapper[4693]: I1204 11:24:21.461876 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:24:21 crc kubenswrapper[4693]: E1204 11:24:21.462809 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:24:32 crc kubenswrapper[4693]: I1204 11:24:32.465086 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:24:32 crc kubenswrapper[4693]: E1204 11:24:32.465862 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:24:43 crc kubenswrapper[4693]: I1204 11:24:43.508156 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:24:43 crc kubenswrapper[4693]: E1204 11:24:43.508908 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:24:58 crc kubenswrapper[4693]: I1204 11:24:58.462315 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:24:58 crc kubenswrapper[4693]: E1204 11:24:58.463232 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:25:09 crc kubenswrapper[4693]: I1204 11:25:09.461458 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:25:09 crc kubenswrapper[4693]: E1204 11:25:09.462476 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:25:24 crc kubenswrapper[4693]: I1204 11:25:24.472455 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:25:24 crc kubenswrapper[4693]: E1204 11:25:24.473173 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:25:36 crc kubenswrapper[4693]: I1204 11:25:36.461865 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:25:36 crc kubenswrapper[4693]: E1204 11:25:36.462818 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:25:49 crc kubenswrapper[4693]: I1204 11:25:49.462088 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:25:49 crc kubenswrapper[4693]: E1204 11:25:49.462912 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sgn9x_openshift-machine-config-operator(d4f65408-7d18-47db-8a19-f9be435dd348)\"" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.587870 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:25:51 crc kubenswrapper[4693]: E1204 11:25:51.588671 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5059293d-f8da-4487-9b36-a7ca7aead90c" containerName="container-00" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.588689 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5059293d-f8da-4487-9b36-a7ca7aead90c" containerName="container-00" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.589118 4693 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5059293d-f8da-4487-9b36-a7ca7aead90c" containerName="container-00" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.590822 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.598557 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.726484 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9q2\" (UniqueName: \"kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.726632 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.726657 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.828037 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br9q2\" (UniqueName: \"kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.828219 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.828243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.828645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.828664 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities\") pod 
\"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.864070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br9q2\" (UniqueName: \"kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2\") pod \"redhat-marketplace-t4pwq\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:51 crc kubenswrapper[4693]: I1204 11:25:51.915687 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:25:52 crc kubenswrapper[4693]: I1204 11:25:52.266690 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:25:52 crc kubenswrapper[4693]: I1204 11:25:52.881186 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerStarted","Data":"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7"} Dec 04 11:25:52 crc kubenswrapper[4693]: I1204 11:25:52.882276 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerStarted","Data":"e6881eeaf3ca508d3d5037e5acc72af8d72b83710cabe24ad0021ba0b9a2bd83"} Dec 04 11:25:53 crc kubenswrapper[4693]: I1204 11:25:53.891654 4693 generic.go:334] "Generic (PLEG): container finished" podID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerID="6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7" exitCode=0 Dec 04 11:25:53 crc kubenswrapper[4693]: I1204 11:25:53.891857 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerDied","Data":"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7"} Dec 04 11:25:53 crc kubenswrapper[4693]: I1204 11:25:53.897200 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 11:25:54 crc kubenswrapper[4693]: I1204 11:25:54.902301 4693 generic.go:334] "Generic (PLEG): container finished" podID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerID="ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca" exitCode=0 Dec 04 11:25:54 crc kubenswrapper[4693]: I1204 11:25:54.902414 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerDied","Data":"ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca"} Dec 04 11:25:55 crc kubenswrapper[4693]: I1204 11:25:55.912721 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerStarted","Data":"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144"} Dec 04 11:25:55 crc kubenswrapper[4693]: I1204 11:25:55.932308 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t4pwq" podStartSLOduration=3.502593809 podStartE2EDuration="4.932290028s" podCreationTimestamp="2025-12-04 11:25:51 +0000 UTC" firstStartedPulling="2025-12-04 11:25:53.896923766 +0000 UTC 
m=+6199.794517519" lastFinishedPulling="2025-12-04 11:25:55.326619995 +0000 UTC m=+6201.224213738" observedRunningTime="2025-12-04 11:25:55.93011381 +0000 UTC m=+6201.827707563" watchObservedRunningTime="2025-12-04 11:25:55.932290028 +0000 UTC m=+6201.829883781" Dec 04 11:25:58 crc kubenswrapper[4693]: I1204 11:25:58.942665 4693 generic.go:334] "Generic (PLEG): container finished" podID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerID="0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be" exitCode=0 Dec 04 11:25:58 crc kubenswrapper[4693]: I1204 11:25:58.942771 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fkp7/must-gather-km9fd" event={"ID":"96fe5a48-f46f-4397-973d-87b60628ba4b","Type":"ContainerDied","Data":"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be"} Dec 04 11:25:58 crc kubenswrapper[4693]: I1204 11:25:58.944931 4693 scope.go:117] "RemoveContainer" containerID="0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be" Dec 04 11:25:59 crc kubenswrapper[4693]: I1204 11:25:59.401542 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9fkp7_must-gather-km9fd_96fe5a48-f46f-4397-973d-87b60628ba4b/gather/0.log" Dec 04 11:26:01 crc kubenswrapper[4693]: I1204 11:26:01.917476 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:01 crc kubenswrapper[4693]: I1204 11:26:01.918039 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:01 crc kubenswrapper[4693]: I1204 11:26:01.969703 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:02 crc kubenswrapper[4693]: I1204 11:26:02.024497 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:02 crc kubenswrapper[4693]: I1204 11:26:02.207255 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:26:03 crc kubenswrapper[4693]: I1204 11:26:03.462465 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:26:03 crc kubenswrapper[4693]: I1204 11:26:03.991884 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"015dd98e9a7b419149c24b71c82036f582afd591210d5f29b29b23133208656a"} Dec 04 11:26:03 crc kubenswrapper[4693]: I1204 11:26:03.992049 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t4pwq" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="registry-server" containerID="cri-o://dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144" gracePeriod=2 Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.492061 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.532486 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content\") pod \"8baab228-31e7-4247-bd51-ea47ab5247bf\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.532574 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities\") pod \"8baab228-31e7-4247-bd51-ea47ab5247bf\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.532642 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br9q2\" (UniqueName: \"kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2\") pod \"8baab228-31e7-4247-bd51-ea47ab5247bf\" (UID: \"8baab228-31e7-4247-bd51-ea47ab5247bf\") " Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.535290 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities" (OuterVolumeSpecName: "utilities") pod "8baab228-31e7-4247-bd51-ea47ab5247bf" (UID: "8baab228-31e7-4247-bd51-ea47ab5247bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.557520 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2" (OuterVolumeSpecName: "kube-api-access-br9q2") pod "8baab228-31e7-4247-bd51-ea47ab5247bf" (UID: "8baab228-31e7-4247-bd51-ea47ab5247bf"). InnerVolumeSpecName "kube-api-access-br9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.569174 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8baab228-31e7-4247-bd51-ea47ab5247bf" (UID: "8baab228-31e7-4247-bd51-ea47ab5247bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.634262 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.634747 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8baab228-31e7-4247-bd51-ea47ab5247bf-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:26:04 crc kubenswrapper[4693]: I1204 11:26:04.634765 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br9q2\" (UniqueName: \"kubernetes.io/projected/8baab228-31e7-4247-bd51-ea47ab5247bf-kube-api-access-br9q2\") on node \"crc\" DevicePath \"\"" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.002208 4693 generic.go:334] "Generic (PLEG): container finished" podID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerID="dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144" exitCode=0 Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.002260 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerDied","Data":"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144"} Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.002296 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4pwq" event={"ID":"8baab228-31e7-4247-bd51-ea47ab5247bf","Type":"ContainerDied","Data":"e6881eeaf3ca508d3d5037e5acc72af8d72b83710cabe24ad0021ba0b9a2bd83"} Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.002301 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4pwq" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.002318 4693 scope.go:117] "RemoveContainer" containerID="dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.024581 4693 scope.go:117] "RemoveContainer" containerID="ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.043888 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.054985 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4pwq"] Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.078504 4693 scope.go:117] "RemoveContainer" containerID="6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.098081 4693 scope.go:117] "RemoveContainer" containerID="dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144" Dec 04 11:26:05 crc kubenswrapper[4693]: E1204 11:26:05.099123 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144\": container with ID starting with dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144 not found: ID does not exist" containerID="dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.099159 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144"} err="failed to get container status \"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144\": rpc error: code = NotFound desc = could not find container \"dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144\": container with ID starting with dbe5df72c0813d17cee3e8b28c14ddd0218850c6dcb7eb6e1f5ef13dfcfc4144 not found: ID does not exist" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.099181 4693 scope.go:117] "RemoveContainer" containerID="ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca" Dec 04 11:26:05 crc kubenswrapper[4693]: E1204 11:26:05.099603 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca\": container with ID starting with ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca not found: ID does not exist" containerID="ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.099630 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca"} err="failed to get container status \"ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca\": rpc error: code = NotFound desc = could not find container \"ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca\": container with ID starting with ff791cfe8d0caf5d618f479f572ab2b35878470af80e5391932b8e2d49c05bca not found: ID does not exist" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.099645 4693 scope.go:117] "RemoveContainer" 
containerID="6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7" Dec 04 11:26:05 crc kubenswrapper[4693]: E1204 11:26:05.099997 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7\": container with ID starting with 6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7 not found: ID does not exist" containerID="6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7" Dec 04 11:26:05 crc kubenswrapper[4693]: I1204 11:26:05.100030 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7"} err="failed to get container status \"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7\": rpc error: code = NotFound desc = could not find container \"6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7\": container with ID starting with 6303914aa486dbe33038738cdc2e2af89567d37213a17b0479ee8e17315d5cd7 not found: ID does not exist" Dec 04 11:26:06 crc kubenswrapper[4693]: I1204 11:26:06.474214 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" path="/var/lib/kubelet/pods/8baab228-31e7-4247-bd51-ea47ab5247bf/volumes" Dec 04 11:26:10 crc kubenswrapper[4693]: I1204 11:26:10.941377 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fkp7/must-gather-km9fd"] Dec 04 11:26:10 crc kubenswrapper[4693]: I1204 11:26:10.942136 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9fkp7/must-gather-km9fd" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="copy" containerID="cri-o://a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244" gracePeriod=2 Dec 04 11:26:10 crc kubenswrapper[4693]: I1204 11:26:10.950171 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fkp7/must-gather-km9fd"] Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.437688 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9fkp7_must-gather-km9fd_96fe5a48-f46f-4397-973d-87b60628ba4b/copy/0.log" Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.438436 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.582180 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output\") pod \"96fe5a48-f46f-4397-973d-87b60628ba4b\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.582296 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj\") pod \"96fe5a48-f46f-4397-973d-87b60628ba4b\" (UID: \"96fe5a48-f46f-4397-973d-87b60628ba4b\") " Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.590583 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj" (OuterVolumeSpecName: "kube-api-access-zq8mj") pod "96fe5a48-f46f-4397-973d-87b60628ba4b" (UID: "96fe5a48-f46f-4397-973d-87b60628ba4b"). InnerVolumeSpecName "kube-api-access-zq8mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.685275 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8mj\" (UniqueName: \"kubernetes.io/projected/96fe5a48-f46f-4397-973d-87b60628ba4b-kube-api-access-zq8mj\") on node \"crc\" DevicePath \"\"" Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.776191 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "96fe5a48-f46f-4397-973d-87b60628ba4b" (UID: "96fe5a48-f46f-4397-973d-87b60628ba4b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:26:11 crc kubenswrapper[4693]: I1204 11:26:11.787533 4693 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/96fe5a48-f46f-4397-973d-87b60628ba4b-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.090065 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9fkp7_must-gather-km9fd_96fe5a48-f46f-4397-973d-87b60628ba4b/copy/0.log" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.090529 4693 generic.go:334] "Generic (PLEG): container finished" podID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerID="a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244" exitCode=143 Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.090590 4693 scope.go:117] "RemoveContainer" containerID="a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.090605 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fkp7/must-gather-km9fd" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.114458 4693 scope.go:117] "RemoveContainer" containerID="0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.187312 4693 scope.go:117] "RemoveContainer" containerID="a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244" Dec 04 11:26:12 crc kubenswrapper[4693]: E1204 11:26:12.187779 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244\": container with ID starting with a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244 not found: ID does not exist" containerID="a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.187875 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244"} err="failed to get container status \"a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244\": rpc error: code = NotFound desc = could not find container \"a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244\": container with ID starting with a69a7486191de38c5e95a003a56205209dc33839186934e012194a59e125b244 not found: ID does not exist" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.187912 4693 scope.go:117] "RemoveContainer" containerID="0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be" Dec 04 11:26:12 crc kubenswrapper[4693]: E1204 11:26:12.188370 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be\": container with ID starting with 0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be not found: ID does not exist" containerID="0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.188441 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be"} err="failed to get container status \"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be\": rpc error: code = NotFound desc = could not find container \"0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be\": container with ID starting with 0e4f387a916af6cfa46c9e0f801f517f0450e99cf08de41bcf48df40600940be not found: ID does not exist" Dec 04 11:26:12 crc kubenswrapper[4693]: I1204 11:26:12.472815 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" path="/var/lib/kubelet/pods/96fe5a48-f46f-4397-973d-87b60628ba4b/volumes" Dec 04 11:27:13 crc kubenswrapper[4693]: I1204 11:27:13.034325 4693 scope.go:117] "RemoveContainer" containerID="1663c8986d5393fccb322ad97997696cdb89d3becf1418280c1dda8b4628b9f1" Dec 04 11:27:13 crc kubenswrapper[4693]: I1204 11:27:13.059515 4693 scope.go:117] "RemoveContainer" containerID="52a9f1f6c66ecb0f1aff1cd77ea7223a217922b350097760ae11166c815ac446" Dec 04 11:28:22 crc kubenswrapper[4693]: I1204 11:28:22.273514 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:28:22 crc kubenswrapper[4693]: I1204 11:28:22.274109 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:28:52 crc kubenswrapper[4693]: I1204 11:28:52.273141 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:28:52 crc kubenswrapper[4693]: I1204 11:28:52.273731 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.272633 4693 patch_prober.go:28] interesting pod/machine-config-daemon-sgn9x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.274651 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.274826 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.275738 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"015dd98e9a7b419149c24b71c82036f582afd591210d5f29b29b23133208656a"} pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.275905 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" podUID="d4f65408-7d18-47db-8a19-f9be435dd348" containerName="machine-config-daemon" containerID="cri-o://015dd98e9a7b419149c24b71c82036f582afd591210d5f29b29b23133208656a" gracePeriod=600 Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.845449 4693 generic.go:334] "Generic (PLEG): container finished" podID="d4f65408-7d18-47db-8a19-f9be435dd348" containerID="015dd98e9a7b419149c24b71c82036f582afd591210d5f29b29b23133208656a" exitCode=0 Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.845602 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" 
event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerDied","Data":"015dd98e9a7b419149c24b71c82036f582afd591210d5f29b29b23133208656a"} Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.845772 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sgn9x" event={"ID":"d4f65408-7d18-47db-8a19-f9be435dd348","Type":"ContainerStarted","Data":"c1a3cef53a2b066d59c2d7f578aa1fe44cf8823b3a9ca15aa2a9d38f99f06341"} Dec 04 11:29:22 crc kubenswrapper[4693]: I1204 11:29:22.845792 4693 scope.go:117] "RemoveContainer" containerID="a658c8753de9168bb389b081cb86a325c160ea8c4f4bfb51fa9941c25be9ffc5" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.151027 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9"] Dec 04 11:30:00 crc kubenswrapper[4693]: E1204 11:30:00.152079 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="extract-content" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152097 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="extract-content" Dec 04 11:30:00 crc kubenswrapper[4693]: E1204 11:30:00.152125 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152131 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4693]: E1204 11:30:00.152143 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="gather" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152149 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="gather" Dec 04 11:30:00 crc kubenswrapper[4693]: E1204 11:30:00.152162 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="extract-utilities" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152169 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="extract-utilities" Dec 04 11:30:00 crc kubenswrapper[4693]: E1204 11:30:00.152197 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="copy" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152204 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="copy" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152439 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="gather" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152458 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fe5a48-f46f-4397-973d-87b60628ba4b" containerName="copy" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.152469 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baab228-31e7-4247-bd51-ea47ab5247bf" containerName="registry-server" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.153183 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.159659 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.161966 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.169452 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9"] Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.350981 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.351078 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.351301 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.452840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.452963 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.453006 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.454472 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume\") pod 
\"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.459986 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.481101 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4\") pod \"collect-profiles-29414130-vl4n9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:00 crc kubenswrapper[4693]: I1204 11:30:00.776657 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:01 crc kubenswrapper[4693]: W1204 11:30:01.211838 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8854c3_8b79_4b85_ae36_562ea6cc4af9.slice/crio-d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab WatchSource:0}: Error finding container d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab: Status 404 returned error can't find the container with id d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab Dec 04 11:30:01 crc kubenswrapper[4693]: I1204 11:30:01.216971 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9"] Dec 04 11:30:01 crc kubenswrapper[4693]: I1204 11:30:01.788081 4693 generic.go:334] "Generic (PLEG): container finished" podID="be8854c3-8b79-4b85-ae36-562ea6cc4af9" containerID="b419ba500047050ec5e53dc79496b707c852f0efcf681e5d900aed5a36f1e8a5" exitCode=0 Dec 04 11:30:01 crc kubenswrapper[4693]: I1204 11:30:01.788521 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" event={"ID":"be8854c3-8b79-4b85-ae36-562ea6cc4af9","Type":"ContainerDied","Data":"b419ba500047050ec5e53dc79496b707c852f0efcf681e5d900aed5a36f1e8a5"} Dec 04 11:30:01 crc kubenswrapper[4693]: I1204 11:30:01.788558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" event={"ID":"be8854c3-8b79-4b85-ae36-562ea6cc4af9","Type":"ContainerStarted","Data":"d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab"} Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.130269 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.312034 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume\") pod \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.312072 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4\") pod \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.312239 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume\") pod \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\" (UID: \"be8854c3-8b79-4b85-ae36-562ea6cc4af9\") " Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.313440 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume" (OuterVolumeSpecName: "config-volume") pod "be8854c3-8b79-4b85-ae36-562ea6cc4af9" (UID: "be8854c3-8b79-4b85-ae36-562ea6cc4af9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.318831 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4" (OuterVolumeSpecName: "kube-api-access-85vw4") pod "be8854c3-8b79-4b85-ae36-562ea6cc4af9" (UID: "be8854c3-8b79-4b85-ae36-562ea6cc4af9"). InnerVolumeSpecName "kube-api-access-85vw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.325254 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be8854c3-8b79-4b85-ae36-562ea6cc4af9" (UID: "be8854c3-8b79-4b85-ae36-562ea6cc4af9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.414521 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be8854c3-8b79-4b85-ae36-562ea6cc4af9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.414555 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be8854c3-8b79-4b85-ae36-562ea6cc4af9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.414565 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/be8854c3-8b79-4b85-ae36-562ea6cc4af9-kube-api-access-85vw4\") on node \"crc\" DevicePath \"\"" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.819889 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" event={"ID":"be8854c3-8b79-4b85-ae36-562ea6cc4af9","Type":"ContainerDied","Data":"d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab"} Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.819950 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18e7697066aa92623bd8a728d5cf97981de45bc156c54cfbd97ad429f11f9ab" Dec 04 11:30:03 crc kubenswrapper[4693]: I1204 11:30:03.820026 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414130-vl4n9" Dec 04 11:30:04 crc kubenswrapper[4693]: I1204 11:30:04.214209 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl"] Dec 04 11:30:04 crc kubenswrapper[4693]: I1204 11:30:04.222396 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-xg6bl"] Dec 04 11:30:04 crc kubenswrapper[4693]: I1204 11:30:04.474736 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fa56cf-2051-47fc-9ac0-d28653185bd6" path="/var/lib/kubelet/pods/e8fa56cf-2051-47fc-9ac0-d28653185bd6/volumes" Dec 04 11:30:13 crc kubenswrapper[4693]: I1204 11:30:13.194390 4693 scope.go:117] "RemoveContainer" containerID="f550bb027b26b10e52780da2ed9403c4fdb175c0c625d2c8146d2d07800a46a3" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.058381 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6kbl7"] Dec 04 11:30:35 crc kubenswrapper[4693]: E1204 11:30:35.059302 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be8854c3-8b79-4b85-ae36-562ea6cc4af9" containerName="collect-profiles" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.059315 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8854c3-8b79-4b85-ae36-562ea6cc4af9" containerName="collect-profiles" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.059516 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="be8854c3-8b79-4b85-ae36-562ea6cc4af9" containerName="collect-profiles" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.061098 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.075628 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kbl7"] Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.081441 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-utilities\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.081525 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-catalog-content\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.081647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf22r\" (UniqueName: \"kubernetes.io/projected/01d340ee-a61d-485d-9a9e-c9628ffc702e-kube-api-access-wf22r\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.183508 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-utilities\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.183614 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-catalog-content\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.183770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf22r\" (UniqueName: \"kubernetes.io/projected/01d340ee-a61d-485d-9a9e-c9628ffc702e-kube-api-access-wf22r\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.184258 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-catalog-content\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.184258 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d340ee-a61d-485d-9a9e-c9628ffc702e-utilities\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.209129 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wf22r\" (UniqueName: \"kubernetes.io/projected/01d340ee-a61d-485d-9a9e-c9628ffc702e-kube-api-access-wf22r\") pod \"certified-operators-6kbl7\" (UID: \"01d340ee-a61d-485d-9a9e-c9628ffc702e\") " pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.385558 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kbl7" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.666407 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dcdgv"] Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.669189 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.700528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-catalog-content\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.700637 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbm6\" (UniqueName: \"kubernetes.io/projected/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-kube-api-access-qtbm6\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.700685 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-utilities\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.703129 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dcdgv"] Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.802726 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbm6\" (UniqueName: \"kubernetes.io/projected/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-kube-api-access-qtbm6\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.803425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-utilities\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.803937 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-utilities\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.804246 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-catalog-content\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.804568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-catalog-content\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.835915 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbm6\" (UniqueName: \"kubernetes.io/projected/3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b-kube-api-access-qtbm6\") pod \"redhat-operators-dcdgv\" (UID: \"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b\") " pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:35 crc kubenswrapper[4693]: I1204 11:30:35.988408 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kbl7"] Dec 04 11:30:36 crc kubenswrapper[4693]: I1204 11:30:36.020960 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcdgv" Dec 04 11:30:36 crc kubenswrapper[4693]: I1204 11:30:36.165590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kbl7" event={"ID":"01d340ee-a61d-485d-9a9e-c9628ffc702e","Type":"ContainerStarted","Data":"82f6df5ced888fb909ec6be3ff86ddf34fbe3fbadd9e0b313270e5e209161a4a"} Dec 04 11:30:36 crc kubenswrapper[4693]: I1204 11:30:36.542105 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dcdgv"] Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.184581 4693 generic.go:334] "Generic (PLEG): container finished" podID="01d340ee-a61d-485d-9a9e-c9628ffc702e" containerID="f229d9925624cb98157bf06599c784ee281d6e56c7d76c1be783bbbada816da0" exitCode=0 Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.184637 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kbl7" event={"ID":"01d340ee-a61d-485d-9a9e-c9628ffc702e","Type":"ContainerDied","Data":"f229d9925624cb98157bf06599c784ee281d6e56c7d76c1be783bbbada816da0"} Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.186662 4693 generic.go:334] "Generic (PLEG): container finished" podID="3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b" containerID="b861aa8c0c8a59b779af59655be7aad76ef9d326ec88031686af90085593cc1b" exitCode=0 Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.186704 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcdgv" event={"ID":"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b","Type":"ContainerDied","Data":"b861aa8c0c8a59b779af59655be7aad76ef9d326ec88031686af90085593cc1b"} Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.186735 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcdgv" event={"ID":"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b","Type":"ContainerStarted","Data":"f0d262eff96f8b6e386432920adcb8b0789402e577eba29f64cddef6f2601606"} Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.446042 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b44p5"] Dec 04 11:30:37 crc 
kubenswrapper[4693]: I1204 11:30:37.448056 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.465922 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b44p5"] Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.642978 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-utilities\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.643769 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-catalog-content\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.643810 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25st\" (UniqueName: \"kubernetes.io/projected/008f3152-4fa6-495a-8cee-123e4e448111-kube-api-access-k25st\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.745001 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-utilities\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.745160 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-catalog-content\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.745178 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k25st\" (UniqueName: \"kubernetes.io/projected/008f3152-4fa6-495a-8cee-123e4e448111-kube-api-access-k25st\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.745667 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-catalog-content\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.745676 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/008f3152-4fa6-495a-8cee-123e4e448111-utilities\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 
04 11:30:37 crc kubenswrapper[4693]: I1204 11:30:37.790056 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25st\" (UniqueName: \"kubernetes.io/projected/008f3152-4fa6-495a-8cee-123e4e448111-kube-api-access-k25st\") pod \"community-operators-b44p5\" (UID: \"008f3152-4fa6-495a-8cee-123e4e448111\") " pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:38 crc kubenswrapper[4693]: I1204 11:30:38.068859 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b44p5" Dec 04 11:30:38 crc kubenswrapper[4693]: I1204 11:30:38.201126 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kbl7" event={"ID":"01d340ee-a61d-485d-9a9e-c9628ffc702e","Type":"ContainerStarted","Data":"77edc6c0cbf62f17db4c55fe3b8caa49c11dc3c8cdd53a45386fa62201ba18cd"} Dec 04 11:30:38 crc kubenswrapper[4693]: I1204 11:30:38.203891 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcdgv" event={"ID":"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b","Type":"ContainerStarted","Data":"308b271c5d2dffb180b9d5c803e3dd31d425466b6ec38c0714a0b39f799b7af0"} Dec 04 11:30:38 crc kubenswrapper[4693]: W1204 11:30:38.550026 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008f3152_4fa6_495a_8cee_123e4e448111.slice/crio-70458348ae7990915ad61bfea50f89cdc42ac14f7db04b735dbd7086b0aeaa99 WatchSource:0}: Error finding container 70458348ae7990915ad61bfea50f89cdc42ac14f7db04b735dbd7086b0aeaa99: Status 404 returned error can't find the container with id 70458348ae7990915ad61bfea50f89cdc42ac14f7db04b735dbd7086b0aeaa99 Dec 04 11:30:38 crc kubenswrapper[4693]: I1204 11:30:38.551830 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b44p5"] Dec 04 11:30:39 crc kubenswrapper[4693]: I1204 11:30:39.214284 4693 generic.go:334] "Generic (PLEG): container finished" podID="008f3152-4fa6-495a-8cee-123e4e448111" containerID="e7a740becbf2e99ae8d2d48715b3eb7e84682e4f501e9f58cd722987a3b65811" exitCode=0 Dec 04 11:30:39 crc kubenswrapper[4693]: I1204 11:30:39.214379 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b44p5" event={"ID":"008f3152-4fa6-495a-8cee-123e4e448111","Type":"ContainerDied","Data":"e7a740becbf2e99ae8d2d48715b3eb7e84682e4f501e9f58cd722987a3b65811"} Dec 04 11:30:39 crc kubenswrapper[4693]: I1204 11:30:39.215110 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b44p5" event={"ID":"008f3152-4fa6-495a-8cee-123e4e448111","Type":"ContainerStarted","Data":"70458348ae7990915ad61bfea50f89cdc42ac14f7db04b735dbd7086b0aeaa99"} Dec 04 11:30:40 crc kubenswrapper[4693]: I1204 11:30:40.225988 4693 generic.go:334] "Generic (PLEG): container finished" podID="3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b" containerID="308b271c5d2dffb180b9d5c803e3dd31d425466b6ec38c0714a0b39f799b7af0" exitCode=0 Dec 04 11:30:40 crc kubenswrapper[4693]: I1204 11:30:40.226118 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcdgv" event={"ID":"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b","Type":"ContainerDied","Data":"308b271c5d2dffb180b9d5c803e3dd31d425466b6ec38c0714a0b39f799b7af0"} Dec 04 11:30:40 crc kubenswrapper[4693]: I1204 11:30:40.229598 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="01d340ee-a61d-485d-9a9e-c9628ffc702e" containerID="77edc6c0cbf62f17db4c55fe3b8caa49c11dc3c8cdd53a45386fa62201ba18cd" exitCode=0 Dec 04 11:30:40 crc kubenswrapper[4693]: I1204 11:30:40.229676 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kbl7" event={"ID":"01d340ee-a61d-485d-9a9e-c9628ffc702e","Type":"ContainerDied","Data":"77edc6c0cbf62f17db4c55fe3b8caa49c11dc3c8cdd53a45386fa62201ba18cd"} Dec 04 11:30:40 crc kubenswrapper[4693]: I1204 11:30:40.233118 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b44p5" event={"ID":"008f3152-4fa6-495a-8cee-123e4e448111","Type":"ContainerStarted","Data":"f6383fe8dcf8156344bb3bde62d76db349491f7e947bf797414356cc19fb5077"} Dec 04 11:30:42 crc kubenswrapper[4693]: I1204 11:30:42.257761 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcdgv" event={"ID":"3dc4d62e-ac69-41fb-be1d-3d5aed63cf8b","Type":"ContainerStarted","Data":"5f45ce31d02e8d661f5fac351f94e259d1b372b410a5868cd04b060a25d13d83"} Dec 04 11:30:42 crc kubenswrapper[4693]: I1204 11:30:42.260468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kbl7" event={"ID":"01d340ee-a61d-485d-9a9e-c9628ffc702e","Type":"ContainerStarted","Data":"50b0cd75857f91bda2c2f4188cd84c277afdbb845a8963fd99b3b828501aadd3"} Dec 04 11:30:42 crc kubenswrapper[4693]: I1204 11:30:42.283072 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dcdgv" podStartSLOduration=3.612820292 podStartE2EDuration="7.28305059s" podCreationTimestamp="2025-12-04 11:30:35 +0000 UTC" firstStartedPulling="2025-12-04 11:30:37.187975941 +0000 UTC m=+6483.085569694" lastFinishedPulling="2025-12-04 11:30:40.858206229 +0000 UTC m=+6486.755799992" observedRunningTime="2025-12-04 11:30:42.274437568 +0000 UTC m=+6488.172031321" watchObservedRunningTime="2025-12-04 11:30:42.28305059 +0000 UTC m=+6488.180644343" Dec 04 11:30:42 crc kubenswrapper[4693]: I1204 11:30:42.298471 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6kbl7" podStartSLOduration=3.5314591699999998 podStartE2EDuration="7.298451275s" podCreationTimestamp="2025-12-04 11:30:35 +0000 UTC" firstStartedPulling="2025-12-04 11:30:37.186237914 +0000 UTC m=+6483.083831667" lastFinishedPulling="2025-12-04 11:30:40.953230019 +0000 UTC m=+6486.850823772" observedRunningTime="2025-12-04 11:30:42.292433423 +0000 UTC m=+6488.190027176" watchObservedRunningTime="2025-12-04 11:30:42.298451275 +0000 UTC m=+6488.196045028"